Mar 10 00:06:12 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 10 00:06:12 crc restorecon[4683]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 10 00:06:12 crc restorecon[4683]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 10 00:06:12 crc restorecon[4683]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:12 crc 
restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 00:06:12 crc restorecon[4683]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 00:06:12 crc restorecon[4683]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:12 crc restorecon[4683]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:12 crc 
restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 
00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 10 00:06:12 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 10 00:06:13 crc 
restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 10 00:06:13 crc 
restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to
system_u:object_r:container_file_t:s0:c14,c22 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 
00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 00:06:13 crc 
restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc 
restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc 
restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 00:06:13 crc restorecon[4683]:
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 
crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc 
restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc 
restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc 
restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc 
restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:13 crc 
restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:13 crc restorecon[4683]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 00:06:13 crc restorecon[4683]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 00:06:13 crc restorecon[4683]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 10 00:06:14 crc kubenswrapper[4906]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 10 00:06:14 crc kubenswrapper[4906]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 10 00:06:14 crc kubenswrapper[4906]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 10 00:06:14 crc kubenswrapper[4906]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 10 00:06:14 crc kubenswrapper[4906]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 10 00:06:14 crc kubenswrapper[4906]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.298892 4906 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304377 4906 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304411 4906 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304420 4906 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304442 4906 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304450 4906 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304458 4906 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304466 4906 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304474 4906 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304481 4906 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304489 4906 
feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304497 4906 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304505 4906 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304513 4906 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304521 4906 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304528 4906 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304539 4906 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304550 4906 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304559 4906 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304568 4906 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304576 4906 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304584 4906 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304592 4906 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304600 4906 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304608 4906 feature_gate.go:330] 
unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304616 4906 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304624 4906 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304632 4906 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304665 4906 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304673 4906 feature_gate.go:330] unrecognized feature gate: Example
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304681 4906 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304689 4906 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304697 4906 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304704 4906 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304712 4906 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304720 4906 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304727 4906 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304735 4906 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304742 4906 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304752 4906 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304759 4906 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304767 4906 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304775 4906 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304782 4906 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304792 4906 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304800 4906 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304808 4906 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304815 4906 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304823 4906 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304831 4906 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304838 4906 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304847 4906 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304855 4906 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304863 4906 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304873 4906 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304883 4906 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304891 4906 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304899 4906 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304906 4906 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304914 4906 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304922 4906 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304930 4906 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304938 4906 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304946 4906 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304953 4906 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304961 4906 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304968 4906 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304980 4906 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.304990 4906 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.305000 4906 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.305011 4906 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.305019 4906 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.305751 4906 flags.go:64] FLAG: --address="0.0.0.0"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.305776 4906 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.305793 4906 flags.go:64] FLAG: --anonymous-auth="true"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.305806 4906 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.305818 4906 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.305827 4906 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.305839 4906 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.305849 4906 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.305859 4906 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.305868 4906 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.305878 4906 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.305887 4906 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.305896 4906 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.305905 4906 flags.go:64] FLAG: --cgroup-root=""
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.305914 4906 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.305923 4906 flags.go:64] FLAG: --client-ca-file=""
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.305932 4906 flags.go:64] FLAG: --cloud-config=""
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.305940 4906 flags.go:64] FLAG: --cloud-provider=""
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.305949 4906 flags.go:64] FLAG: --cluster-dns="[]"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.305959 4906 flags.go:64] FLAG: --cluster-domain=""
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.305968 4906 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.305977 4906 flags.go:64] FLAG: --config-dir=""
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.305986 4906 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.305996 4906 flags.go:64] FLAG: --container-log-max-files="5"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306007 4906 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306016 4906 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306025 4906 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306034 4906 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306043 4906 flags.go:64] FLAG: --contention-profiling="false"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306053 4906 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306063 4906 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306072 4906 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306081 4906 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306092 4906 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306100 4906 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306110 4906 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306118 4906 flags.go:64] FLAG: --enable-load-reader="false"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306127 4906 flags.go:64] FLAG: --enable-server="true"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306137 4906 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306150 4906 flags.go:64] FLAG: --event-burst="100"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306160 4906 flags.go:64] FLAG: --event-qps="50"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306168 4906 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306177 4906 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306187 4906 flags.go:64] FLAG: --eviction-hard=""
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306197 4906 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306207 4906 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306216 4906 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306225 4906 flags.go:64] FLAG: --eviction-soft=""
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306234 4906 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306243 4906 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306252 4906 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306261 4906 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306270 4906 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306278 4906 flags.go:64] FLAG: --fail-swap-on="true"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306287 4906 flags.go:64] FLAG: --feature-gates=""
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306297 4906 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306307 4906 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306316 4906 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306326 4906 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306335 4906 flags.go:64] FLAG: --healthz-port="10248"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306344 4906 flags.go:64] FLAG: --help="false"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306353 4906 flags.go:64] FLAG: --hostname-override=""
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306363 4906 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306373 4906 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306381 4906 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306390 4906 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306400 4906 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306408 4906 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306417 4906 flags.go:64] FLAG: --image-service-endpoint=""
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306426 4906 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306435 4906 flags.go:64] FLAG: --kube-api-burst="100"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306444 4906 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306453 4906 flags.go:64] FLAG: --kube-api-qps="50"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306462 4906 flags.go:64] FLAG: --kube-reserved=""
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306471 4906 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306480 4906 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306490 4906 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306499 4906 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306507 4906 flags.go:64] FLAG: --lock-file=""
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306516 4906 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306525 4906 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306534 4906 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306547 4906 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306555 4906 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306564 4906 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306573 4906 flags.go:64] FLAG: --logging-format="text"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306582 4906 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306591 4906 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306600 4906 flags.go:64] FLAG: --manifest-url=""
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306608 4906 flags.go:64] FLAG: --manifest-url-header=""
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306620 4906 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306629 4906 flags.go:64] FLAG: --max-open-files="1000000"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306663 4906 flags.go:64] FLAG: --max-pods="110"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306672 4906 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306682 4906 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306692 4906 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306700 4906 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306710 4906 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306720 4906 flags.go:64] FLAG: --node-ip="192.168.126.11"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306729 4906 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306749 4906 flags.go:64] FLAG: --node-status-max-images="50"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306758 4906 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306767 4906 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306776 4906 flags.go:64] FLAG: --pod-cidr=""
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306784 4906 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306798 4906 flags.go:64] FLAG: --pod-manifest-path=""
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306807 4906 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306816 4906 flags.go:64] FLAG: --pods-per-core="0"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306825 4906 flags.go:64] FLAG: --port="10250"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306834 4906 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306843 4906 flags.go:64] FLAG: --provider-id=""
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306853 4906 flags.go:64] FLAG: --qos-reserved=""
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306861 4906 flags.go:64] FLAG: --read-only-port="10255"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306871 4906 flags.go:64] FLAG: --register-node="true"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306879 4906 flags.go:64] FLAG: --register-schedulable="true"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306888 4906 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306902 4906 flags.go:64] FLAG: --registry-burst="10"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306911 4906 flags.go:64] FLAG: --registry-qps="5"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306920 4906 flags.go:64] FLAG: --reserved-cpus=""
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306929 4906 flags.go:64] FLAG: --reserved-memory=""
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306940 4906 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306949 4906 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306969 4906 flags.go:64] FLAG: --rotate-certificates="false"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306978 4906 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306987 4906 flags.go:64] FLAG: --runonce="false"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.306996 4906 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.307007 4906 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.307017 4906 flags.go:64] FLAG: --seccomp-default="false"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.307027 4906 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.307036 4906 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.307045 4906 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.307055 4906 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.307064 4906 flags.go:64] FLAG: --storage-driver-password="root"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.307073 4906 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.307082 4906 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.307091 4906 flags.go:64] FLAG: --storage-driver-user="root"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.307099 4906 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.307109 4906 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.307118 4906 flags.go:64] FLAG: --system-cgroups=""
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.307126 4906 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.307141 4906 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.307149 4906 flags.go:64] FLAG: --tls-cert-file=""
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.307158 4906 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.307169 4906 flags.go:64] FLAG: --tls-min-version=""
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.307178 4906 flags.go:64] FLAG: --tls-private-key-file=""
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.307187 4906 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.307195 4906 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.307206 4906 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.307214 4906 flags.go:64] FLAG: --v="2"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.307226 4906 flags.go:64] FLAG: --version="false"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.307238 4906 flags.go:64] FLAG: --vmodule=""
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.307248 4906 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.307258 4906 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.307500 4906 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.307511 4906 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.307519 4906 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.307527 4906 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.307535 4906 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.307543 4906 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.307552 4906 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.307560 4906 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.307568 4906 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.307575 4906 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.307583 4906 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.307590 4906 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.307599 4906 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.307607 4906 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.307614 4906 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.307621 4906 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.307629 4906 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.307664 4906 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.307675 4906 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.307684 4906 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.307694 4906 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.307703 4906 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.307712 4906 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.307722 4906 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.307730 4906 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.307738 4906 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.307746 4906 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.307755 4906 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.307764 4906 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.307773 4906 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.307783 4906 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.307793 4906 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.307801 4906 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.307811 4906 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.307821 4906 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.307829 4906 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.307837 4906 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.307845 4906 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.307853 4906 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.307862 4906 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.307870 4906 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.307878 4906 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.307886 4906 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.307893 4906 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.307902 4906 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.307909 4906 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.307917 4906 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.307924 4906 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.307932 4906 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.307940 4906 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.307948 4906 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.307956 4906 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.307964 4906 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.307972 4906 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.307980 4906 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.307987 4906 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.307995 4906 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.308003 4906 feature_gate.go:330] unrecognized feature gate: Example
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.308011 4906 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.308019 4906 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.308027 4906 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.308034 4906 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.308042 4906 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.308050 4906 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.308057 4906 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.308065 4906 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.308074 4906 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.308081 4906 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.308092 4906 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.308102 4906 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.308112 4906 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.308136 4906 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.323912 4906 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.323952 4906 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324067 4906 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324079 4906 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324084 4906 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324089 4906 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324094 4906 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324098 4906 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324104 4906 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324110 4906 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324116 4906 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324126 4906 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324132 4906 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324138 4906 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324143 4906 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324148 4906 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324153 4906 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324159 4906 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324165 4906 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324170 4906 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324174 4906 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324180 4906 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324185 4906 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324204 4906 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324209 4906 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324213 4906 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324218 4906 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324223 4906 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324227 4906 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324231 4906 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324241 4906 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324246 4906 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324251 4906 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324255 4906 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324260 4906 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324265 4906 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324269 4906 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324274 4906 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324279 4906 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324284 4906 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324289 4906 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324294 4906 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324299 4906 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324304 4906 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324309 4906 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324314 4906 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324319 4906 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324324 4906 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324329 4906 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324333 4906 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324339 4906 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324344 4906 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324349 4906 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324353 4906 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324358 4906 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324362 4906 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324368 4906 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324374 4906 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324379 4906 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324384 4906 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324389 4906 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324394 4906 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324398 4906 feature_gate.go:330] unrecognized feature gate: Example
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324405 4906 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324410 4906 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324415 4906 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324419 4906 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324424 4906 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324429 4906 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324434 4906 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324439 4906 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324444 4906 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324449 4906 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.324459 4906 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324675 4906 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324689 4906 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324696 4906 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324701 4906 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324708 4906 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324713 4906 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324718 4906 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324723 4906 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324728 4906 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324734 4906 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324739 4906 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324744 4906 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324748 4906 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324754 4906 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324759 4906 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324764 4906 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324769 4906 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324773 4906 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324778 4906 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324782 4906 feature_gate.go:330] unrecognized feature gate: Example
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324787 4906 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324792 4906 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324797 4906 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324802 4906 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324806 4906 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324811 4906 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324818 4906 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324826 4906 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324832 4906 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324838 4906 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324843 4906 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324848 4906 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324853 4906 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324860 4906 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324865 4906 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324870 4906 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324875 4906 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324881 4906 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324887 4906 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324892 4906 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324898 4906 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324904 4906 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324912 4906 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324917 4906 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324922 4906 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324926 4906 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324931 4906 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324936 4906 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324941 4906 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324946 4906 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324950 4906 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324955 4906 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324960 4906 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324964 4906 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324969 4906 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324975 4906 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324981 4906 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324986 4906 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324991 4906 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.324997 4906 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.325001 4906 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.325007 4906 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.325013 4906 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.325018 4906 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.325023 4906 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.325028 4906 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.325032 4906 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.325037 4906 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.325043 4906 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.325049 4906 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.325054 4906 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.325061 4906 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.326104 4906 server.go:940] "Client rotation is on, will bootstrap in background"
Mar 10 00:06:14 crc kubenswrapper[4906]: E0310 00:06:14.330881 4906 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.337587 4906 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.337722 4906 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.339348 4906 server.go:997] "Starting client certificate rotation"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.339378 4906 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.340830 4906 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.365832 4906 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.367931 4906 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 10 00:06:14 crc kubenswrapper[4906]: E0310 00:06:14.369372 4906 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.179:6443: connect: connection refused" logger="UnhandledError"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.396723 4906 log.go:25] "Validated CRI v1 runtime API"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.440257 4906 log.go:25] "Validated CRI v1 image API"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.443389 4906 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.451532 4906 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-10-00-01-10-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.451598 4906 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.478667 4906 manager.go:217] Machine: {Timestamp:2026-03-10 00:06:14.475484361 +0000 UTC m=+0.623379503 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:5a964b87-c4f2-4bba-95e1-b2c12e6316ae BootID:2a60179e-98a5-4d7f-9dd0-5aef84f37492 Filesystems:[{Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:ba:97:e5 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:ba:97:e5 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:4e:c5:92 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:5c:1b:bd Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:57:64:e7 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:ae:e5:f8 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:2e:7f:63:67:c1:f8 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:9a:87:79:75:cb:77 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.479001 4906 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.479387 4906 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.479987 4906 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.480222 4906 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.480271 4906 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.480526 4906 topology_manager.go:138] "Creating topology manager with none policy"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.480539 4906 container_manager_linux.go:303] "Creating device plugin manager"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.481089 4906 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.481126 4906 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.481922 4906 state_mem.go:36] "Initialized new in-memory state store"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.482843 4906 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.487839 4906 kubelet.go:418] "Attempting to sync node with API server"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.487869 4906 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.487903 4906 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.487918 4906 kubelet.go:324] "Adding apiserver pod source"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.487933 4906 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.492966 4906 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.494161 4906 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.495158 4906 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.179:6443: connect: connection refused
Mar 10 00:06:14 crc kubenswrapper[4906]: E0310 00:06:14.495285 4906 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.179:6443: connect: connection refused" logger="UnhandledError"
Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.495586 4906 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.179:6443: connect: connection refused
Mar 10 00:06:14 crc kubenswrapper[4906]: E0310 00:06:14.495796 4906 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.179:6443: connect: connection refused" logger="UnhandledError"
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.497446 4906 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 10 00:06:14
crc kubenswrapper[4906]: I0310 00:06:14.498952 4906 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.498987 4906 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.498999 4906 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.499010 4906 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.499027 4906 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.499038 4906 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.499052 4906 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.499148 4906 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.499164 4906 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.499177 4906 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.499201 4906 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.499216 4906 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.500362 4906 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.501420 4906 server.go:1280] "Started kubelet" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 
00:06:14.502102 4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.179:6443: connect: connection refused Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.503117 4906 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.503107 4906 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.503692 4906 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 10 00:06:14 crc systemd[1]: Started Kubernetes Kubelet. Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.504003 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.504047 4906 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.505049 4906 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.505074 4906 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 10 00:06:14 crc kubenswrapper[4906]: E0310 00:06:14.505336 4906 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.505352 4906 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.512404 4906 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.179:6443: connect: connection refused Mar 10 
00:06:14 crc kubenswrapper[4906]: E0310 00:06:14.512537 4906 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.179:6443: connect: connection refused" logger="UnhandledError" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.513472 4906 factory.go:55] Registering systemd factory Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.514289 4906 factory.go:221] Registration of the systemd container factory successfully Mar 10 00:06:14 crc kubenswrapper[4906]: E0310 00:06:14.514343 4906 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.179:6443: connect: connection refused" interval="200ms" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.514927 4906 factory.go:153] Registering CRI-O factory Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.514968 4906 factory.go:221] Registration of the crio container factory successfully Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.515042 4906 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.515077 4906 factory.go:103] Registering Raw factory Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.515103 4906 manager.go:1196] Started watching for new ooms in manager Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.515968 4906 server.go:460] "Adding debug handlers to kubelet server" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.516026 4906 manager.go:319] Starting recovery of 
all containers Mar 10 00:06:14 crc kubenswrapper[4906]: E0310 00:06:14.517724 4906 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.179:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189b521fb8fb8425 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:14.501377061 +0000 UTC m=+0.649272213,LastTimestamp:2026-03-10 00:06:14.501377061 +0000 UTC m=+0.649272213,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.522070 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.522240 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.522254 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.522265 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.522281 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.522292 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.522328 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.522340 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.522353 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.522364 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.522374 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.522417 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.522447 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.522465 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.522481 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.522494 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" 
seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.522506 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.522519 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.522531 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.522543 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.522558 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.522573 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 
00:06:14.522622 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.522652 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.522667 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.522680 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.522693 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.522706 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.522721 4906 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.522736 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.522749 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.522760 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.522773 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.522785 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.522798 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" 
volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.522811 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.522827 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.522839 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.522876 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.522891 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.522903 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.522915 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.522927 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.522939 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.522955 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.522969 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.522982 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" 
volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.522997 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.523009 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.523023 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.523035 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.523049 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.523096 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" 
seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.523114 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.523127 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.523142 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.523157 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.523170 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.523183 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 10 00:06:14 crc 
kubenswrapper[4906]: I0310 00:06:14.523197 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.523213 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.523226 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.523239 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.523251 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.523266 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.523304 4906 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.523318 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.523333 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.523347 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.523361 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.523376 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.527759 4906 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" 
volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.527792 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.527808 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.527822 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.527837 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.527852 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.527890 4906 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.527902 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.527914 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.527925 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.527943 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.527959 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.527974 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.527987 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.527999 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.528013 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.528026 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.528037 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.528049 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" 
volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.528061 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.528073 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.528089 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.528113 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.528125 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.528137 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" 
seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.528149 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.528162 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.528176 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.528188 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.528226 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.528237 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 
00:06:14.528248 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.528259 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.528272 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.528333 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.528350 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.528363 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.528614 4906 reconstruct.go:130] "Volume is marked 
as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.528629 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.528663 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.528673 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.528708 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.528718 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.528773 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.528785 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.528847 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.528902 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.528912 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.528922 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.528932 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" 
volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.528942 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.528953 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.528963 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.529014 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.529023 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.529032 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" 
seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.529042 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.529054 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.529062 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.529073 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.529082 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.529115 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.529125 4906 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.529135 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.529144 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.529156 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.529167 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.529175 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.529185 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.529222 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.529232 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.529242 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.529252 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.529262 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.529270 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" 
volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.529279 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.529290 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.529325 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.529333 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.529342 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.529352 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.529433 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.529445 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.529455 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.529466 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.529525 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.529536 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" 
volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.529547 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.529559 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.529570 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.529581 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.529593 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.529602 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.529651 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.529662 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.529671 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.529679 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.529689 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.529698 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" 
seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.529733 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.529744 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.529830 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.529905 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.529916 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.529924 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 10 00:06:14 crc 
kubenswrapper[4906]: I0310 00:06:14.529932 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.529945 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.529953 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.529962 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.529994 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.530004 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.530013 4906 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.530021 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.530029 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.530038 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.530047 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.530055 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.530092 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.530123 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.530175 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.530183 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.530191 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.530200 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.530208 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.530218 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.530248 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.530257 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.530266 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.530276 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.530356 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" 
seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.530367 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.530401 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.530457 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.530497 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.530552 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.530562 4906 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.530571 4906 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.530579 4906 reconstruct.go:97] "Volume reconstruction finished" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.530587 4906 reconciler.go:26] "Reconciler: start to sync state" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.535259 4906 manager.go:324] Recovery completed Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.544723 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.546983 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.547061 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.547079 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.550219 4906 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.550238 4906 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.550259 4906 state_mem.go:36] "Initialized new in-memory state store" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.572978 4906 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.573493 4906 policy_none.go:49] "None policy: Start" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.575299 4906 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.575375 4906 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.575435 4906 kubelet.go:2335] "Starting kubelet main sync loop" Mar 10 00:06:14 crc kubenswrapper[4906]: E0310 00:06:14.575512 4906 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.575859 4906 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.575908 4906 state_mem.go:35] "Initializing new in-memory state store" Mar 10 00:06:14 crc kubenswrapper[4906]: W0310 00:06:14.576529 4906 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.179:6443: connect: connection refused Mar 10 00:06:14 crc kubenswrapper[4906]: E0310 00:06:14.576599 4906 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.179:6443: connect: connection refused" logger="UnhandledError" Mar 10 00:06:14 crc kubenswrapper[4906]: E0310 00:06:14.605836 4906 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.655488 4906 manager.go:334] 
"Starting Device Plugin manager" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.655543 4906 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.655557 4906 server.go:79] "Starting device plugin registration server" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.656054 4906 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.656071 4906 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.656505 4906 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.656581 4906 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.656592 4906 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 10 00:06:14 crc kubenswrapper[4906]: E0310 00:06:14.672485 4906 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.675927 4906 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc"] Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.676145 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.677717 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.677791 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.677812 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.678083 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.678303 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.678385 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.679520 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.679752 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.679787 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.679936 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.679945 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.679993 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.680051 4906 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.680105 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.680152 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.680957 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.680999 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.681087 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.681095 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.681122 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.681133 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.681285 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.681330 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.681368 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.682114 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.682139 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.682152 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.682180 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.682194 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.682201 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.682309 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.682413 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.682448 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.683098 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.683136 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.683151 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.683406 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.683441 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.684076 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.684131 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.684180 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.685188 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.685269 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:14 crc 
kubenswrapper[4906]: I0310 00:06:14.685289 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:14 crc kubenswrapper[4906]: E0310 00:06:14.715468 4906 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.179:6443: connect: connection refused" interval="400ms" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.733484 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.733544 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.733612 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.733795 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.733907 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.733991 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.734036 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.734068 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.734102 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 
00:06:14.734136 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.734180 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.734232 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.734275 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.734336 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.734354 4906 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.756344 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.758086 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.758139 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.758180 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.758221 4906 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 00:06:14 crc kubenswrapper[4906]: E0310 00:06:14.758961 4906 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.179:6443: connect: connection refused" node="crc" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.835454 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.835725 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.835886 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.836047 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.836212 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.835935 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.836219 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.836365 4906 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.836403 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.836430 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.836481 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.836467 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.836503 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.836495 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.837162 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.837224 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.837298 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.837330 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.837356 4906 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.837346 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.837422 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.837437 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.837394 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.837554 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: 
\"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.837658 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.837628 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.837727 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.837772 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.840535 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.840583 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.959545 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.960865 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.960912 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.960925 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:14 crc kubenswrapper[4906]: I0310 00:06:14.960958 4906 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 00:06:14 crc kubenswrapper[4906]: E0310 00:06:14.961433 4906 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.179:6443: connect: connection refused" node="crc" Mar 10 00:06:15 crc kubenswrapper[4906]: I0310 00:06:15.023465 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:06:15 crc kubenswrapper[4906]: I0310 00:06:15.037835 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:06:15 crc kubenswrapper[4906]: I0310 00:06:15.071273 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 00:06:15 crc kubenswrapper[4906]: W0310 00:06:15.072933 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-a7009900db272b6936aef053ca660f00efc551536c9b4ca64482b8a36fdd87c7 WatchSource:0}: Error finding container a7009900db272b6936aef053ca660f00efc551536c9b4ca64482b8a36fdd87c7: Status 404 returned error can't find the container with id a7009900db272b6936aef053ca660f00efc551536c9b4ca64482b8a36fdd87c7 Mar 10 00:06:15 crc kubenswrapper[4906]: W0310 00:06:15.079510 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-aaa85fa164dea0b9a4dfb71b5f6eb67be3d9fa6be275d0b8355b232e723c3136 WatchSource:0}: Error finding container aaa85fa164dea0b9a4dfb71b5f6eb67be3d9fa6be275d0b8355b232e723c3136: Status 404 returned error can't find the container with id aaa85fa164dea0b9a4dfb71b5f6eb67be3d9fa6be275d0b8355b232e723c3136 Mar 10 00:06:15 crc kubenswrapper[4906]: W0310 00:06:15.094044 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-7d9a3aa6dd4c1971122360f03ee638ca52fb85fd2d74923add8eac184da69120 WatchSource:0}: Error finding container 7d9a3aa6dd4c1971122360f03ee638ca52fb85fd2d74923add8eac184da69120: Status 404 returned error can't find the container with id 7d9a3aa6dd4c1971122360f03ee638ca52fb85fd2d74923add8eac184da69120 Mar 10 00:06:15 crc kubenswrapper[4906]: I0310 00:06:15.099595 4906 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 00:06:15 crc kubenswrapper[4906]: I0310 00:06:15.108504 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 10 00:06:15 crc kubenswrapper[4906]: E0310 00:06:15.117218 4906 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.179:6443: connect: connection refused" interval="800ms" Mar 10 00:06:15 crc kubenswrapper[4906]: W0310 00:06:15.122019 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-c0b28682b34c2bb677cc83bab43e3cf0d493e2c8098e9799ab59a346d56995f8 WatchSource:0}: Error finding container c0b28682b34c2bb677cc83bab43e3cf0d493e2c8098e9799ab59a346d56995f8: Status 404 returned error can't find the container with id c0b28682b34c2bb677cc83bab43e3cf0d493e2c8098e9799ab59a346d56995f8 Mar 10 00:06:15 crc kubenswrapper[4906]: W0310 00:06:15.131111 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-58251457514ec6400b27635b7d092e2b89705607286d3c8f0ebbdf581115c967 WatchSource:0}: Error finding container 58251457514ec6400b27635b7d092e2b89705607286d3c8f0ebbdf581115c967: Status 404 returned error can't find the container with id 58251457514ec6400b27635b7d092e2b89705607286d3c8f0ebbdf581115c967 Mar 10 00:06:15 crc kubenswrapper[4906]: W0310 00:06:15.347525 4906 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.179:6443: connect: connection refused Mar 10 
00:06:15 crc kubenswrapper[4906]: E0310 00:06:15.347618 4906 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.179:6443: connect: connection refused" logger="UnhandledError" Mar 10 00:06:15 crc kubenswrapper[4906]: I0310 00:06:15.361975 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:15 crc kubenswrapper[4906]: I0310 00:06:15.363952 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:15 crc kubenswrapper[4906]: I0310 00:06:15.363998 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:15 crc kubenswrapper[4906]: I0310 00:06:15.364010 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:15 crc kubenswrapper[4906]: I0310 00:06:15.364041 4906 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 00:06:15 crc kubenswrapper[4906]: E0310 00:06:15.364476 4906 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.179:6443: connect: connection refused" node="crc" Mar 10 00:06:15 crc kubenswrapper[4906]: I0310 00:06:15.503780 4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.179:6443: connect: connection refused Mar 10 00:06:15 crc kubenswrapper[4906]: W0310 00:06:15.587202 4906 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get 
"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.179:6443: connect: connection refused Mar 10 00:06:15 crc kubenswrapper[4906]: E0310 00:06:15.587291 4906 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.179:6443: connect: connection refused" logger="UnhandledError" Mar 10 00:06:15 crc kubenswrapper[4906]: I0310 00:06:15.590526 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"58251457514ec6400b27635b7d092e2b89705607286d3c8f0ebbdf581115c967"} Mar 10 00:06:15 crc kubenswrapper[4906]: I0310 00:06:15.591561 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"c0b28682b34c2bb677cc83bab43e3cf0d493e2c8098e9799ab59a346d56995f8"} Mar 10 00:06:15 crc kubenswrapper[4906]: I0310 00:06:15.592561 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"7d9a3aa6dd4c1971122360f03ee638ca52fb85fd2d74923add8eac184da69120"} Mar 10 00:06:15 crc kubenswrapper[4906]: I0310 00:06:15.593410 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"aaa85fa164dea0b9a4dfb71b5f6eb67be3d9fa6be275d0b8355b232e723c3136"} Mar 10 00:06:15 crc kubenswrapper[4906]: I0310 00:06:15.594232 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a7009900db272b6936aef053ca660f00efc551536c9b4ca64482b8a36fdd87c7"} Mar 10 00:06:15 crc kubenswrapper[4906]: W0310 00:06:15.722256 4906 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.179:6443: connect: connection refused Mar 10 00:06:15 crc kubenswrapper[4906]: E0310 00:06:15.722923 4906 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.179:6443: connect: connection refused" logger="UnhandledError" Mar 10 00:06:15 crc kubenswrapper[4906]: W0310 00:06:15.871121 4906 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.179:6443: connect: connection refused Mar 10 00:06:15 crc kubenswrapper[4906]: E0310 00:06:15.871207 4906 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.179:6443: connect: connection refused" logger="UnhandledError" Mar 10 00:06:15 crc kubenswrapper[4906]: E0310 00:06:15.918518 4906 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.179:6443: connect: connection refused" 
interval="1.6s" Mar 10 00:06:16 crc kubenswrapper[4906]: I0310 00:06:16.165345 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:16 crc kubenswrapper[4906]: I0310 00:06:16.166998 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:16 crc kubenswrapper[4906]: I0310 00:06:16.167212 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:16 crc kubenswrapper[4906]: I0310 00:06:16.167360 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:16 crc kubenswrapper[4906]: I0310 00:06:16.167506 4906 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 00:06:16 crc kubenswrapper[4906]: E0310 00:06:16.168259 4906 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.179:6443: connect: connection refused" node="crc" Mar 10 00:06:16 crc kubenswrapper[4906]: I0310 00:06:16.378758 4906 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 10 00:06:16 crc kubenswrapper[4906]: E0310 00:06:16.380181 4906 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.179:6443: connect: connection refused" logger="UnhandledError" Mar 10 00:06:16 crc kubenswrapper[4906]: I0310 00:06:16.503704 4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.179:6443: 
connect: connection refused Mar 10 00:06:16 crc kubenswrapper[4906]: I0310 00:06:16.600716 4906 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069" exitCode=0 Mar 10 00:06:16 crc kubenswrapper[4906]: I0310 00:06:16.600821 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069"} Mar 10 00:06:16 crc kubenswrapper[4906]: I0310 00:06:16.600882 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:16 crc kubenswrapper[4906]: I0310 00:06:16.602255 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:16 crc kubenswrapper[4906]: I0310 00:06:16.602298 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:16 crc kubenswrapper[4906]: I0310 00:06:16.602307 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:16 crc kubenswrapper[4906]: I0310 00:06:16.603294 4906 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="58522e059388d727d4f813a5438c633d60317914d9cc8c3f01eec29c064e8b15" exitCode=0 Mar 10 00:06:16 crc kubenswrapper[4906]: I0310 00:06:16.603357 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"58522e059388d727d4f813a5438c633d60317914d9cc8c3f01eec29c064e8b15"} Mar 10 00:06:16 crc kubenswrapper[4906]: I0310 00:06:16.603388 4906 kubelet_node_status.go:401] "Setting node annotation to enable 
volume controller attach/detach" Mar 10 00:06:16 crc kubenswrapper[4906]: I0310 00:06:16.604452 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:16 crc kubenswrapper[4906]: I0310 00:06:16.604482 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:16 crc kubenswrapper[4906]: I0310 00:06:16.604491 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:16 crc kubenswrapper[4906]: I0310 00:06:16.608647 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3090d401f5153e72de6163581cf256a48f9e2aed39a2a1ab1911f0322fc2fd4c"} Mar 10 00:06:16 crc kubenswrapper[4906]: I0310 00:06:16.608694 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"acd01291605de2fa2a88cfaff323d3a654acbaff0550f4faa872560a04ac582b"} Mar 10 00:06:16 crc kubenswrapper[4906]: I0310 00:06:16.608732 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:16 crc kubenswrapper[4906]: I0310 00:06:16.608737 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b037feadaa2542ab3d4b67a5ced4e73ab0764d2c739c7ab3df075007346e16fe"} Mar 10 00:06:16 crc kubenswrapper[4906]: I0310 00:06:16.608865 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"aefd20293d0c60b7686525f505af7f8d1ddd1ab1b3c63f95787470f6ef538df2"} Mar 10 00:06:16 crc kubenswrapper[4906]: I0310 00:06:16.610280 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:16 crc kubenswrapper[4906]: I0310 00:06:16.610325 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:16 crc kubenswrapper[4906]: I0310 00:06:16.610342 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:16 crc kubenswrapper[4906]: I0310 00:06:16.612007 4906 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964" exitCode=0 Mar 10 00:06:16 crc kubenswrapper[4906]: I0310 00:06:16.612109 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964"} Mar 10 00:06:16 crc kubenswrapper[4906]: I0310 00:06:16.612275 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:16 crc kubenswrapper[4906]: I0310 00:06:16.614015 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:16 crc kubenswrapper[4906]: I0310 00:06:16.614075 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:16 crc kubenswrapper[4906]: I0310 00:06:16.614096 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:16 crc kubenswrapper[4906]: I0310 00:06:16.617871 4906 generic.go:334] 
"Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a" exitCode=0 Mar 10 00:06:16 crc kubenswrapper[4906]: I0310 00:06:16.617915 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a"} Mar 10 00:06:16 crc kubenswrapper[4906]: I0310 00:06:16.618045 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:16 crc kubenswrapper[4906]: I0310 00:06:16.619355 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:16 crc kubenswrapper[4906]: I0310 00:06:16.619470 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:16 crc kubenswrapper[4906]: I0310 00:06:16.619477 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:16 crc kubenswrapper[4906]: I0310 00:06:16.619495 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:16 crc kubenswrapper[4906]: I0310 00:06:16.622008 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:16 crc kubenswrapper[4906]: I0310 00:06:16.622058 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:16 crc kubenswrapper[4906]: I0310 00:06:16.622078 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:17 crc kubenswrapper[4906]: I0310 00:06:17.503306 4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.179:6443: connect: connection refused Mar 10 00:06:17 crc kubenswrapper[4906]: E0310 00:06:17.519893 4906 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.179:6443: connect: connection refused" interval="3.2s" Mar 10 00:06:17 crc kubenswrapper[4906]: I0310 00:06:17.610757 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:06:17 crc kubenswrapper[4906]: W0310 00:06:17.611778 4906 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.179:6443: connect: connection refused Mar 10 00:06:17 crc kubenswrapper[4906]: E0310 00:06:17.611876 4906 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.179:6443: connect: connection refused" logger="UnhandledError" Mar 10 00:06:17 crc kubenswrapper[4906]: I0310 00:06:17.622095 4906 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b" exitCode=0 Mar 10 00:06:17 crc kubenswrapper[4906]: I0310 00:06:17.622183 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b"} Mar 10 00:06:17 crc 
kubenswrapper[4906]: I0310 00:06:17.622202 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:17 crc kubenswrapper[4906]: I0310 00:06:17.623010 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:17 crc kubenswrapper[4906]: I0310 00:06:17.623044 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:17 crc kubenswrapper[4906]: I0310 00:06:17.623058 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:17 crc kubenswrapper[4906]: I0310 00:06:17.625746 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"a4aeb4d2ad9fadbae974054da89166076b81a944e08f5475f1a9382a48cdc623"} Mar 10 00:06:17 crc kubenswrapper[4906]: I0310 00:06:17.625825 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:17 crc kubenswrapper[4906]: I0310 00:06:17.626869 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:17 crc kubenswrapper[4906]: I0310 00:06:17.626908 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:17 crc kubenswrapper[4906]: I0310 00:06:17.626921 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:17 crc kubenswrapper[4906]: I0310 00:06:17.629044 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"0eca935bc2a1af8e0157cb652fecdf524d319c037423de917084bc258eb00317"} 
Mar 10 00:06:17 crc kubenswrapper[4906]: I0310 00:06:17.629070 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"159a9ac970d599fe117be8d5340e4ab61b723d5765dc14fca8834807c38ead55"} Mar 10 00:06:17 crc kubenswrapper[4906]: I0310 00:06:17.629083 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"b0eb883e8f9f1bea4366cef5df73fd7a58ca339c0a3460cfccb92085380013dc"} Mar 10 00:06:17 crc kubenswrapper[4906]: I0310 00:06:17.629144 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:17 crc kubenswrapper[4906]: I0310 00:06:17.629927 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:17 crc kubenswrapper[4906]: I0310 00:06:17.629954 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:17 crc kubenswrapper[4906]: I0310 00:06:17.629965 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:17 crc kubenswrapper[4906]: I0310 00:06:17.631955 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:17 crc kubenswrapper[4906]: I0310 00:06:17.631938 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"03c49063cf40b4074c120bf78de05481a9e5638b11137a47d73d15b3e1a43f41"} Mar 10 00:06:17 crc kubenswrapper[4906]: I0310 00:06:17.632098 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1237a4e233c776796482d73c65c85e5d86e592d6a7f53a7b76fea4529fe9e435"} Mar 10 00:06:17 crc kubenswrapper[4906]: I0310 00:06:17.632117 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e445f27670002f5ac446f9be56896acdbf9e3e30801b073b231cc8ce7ddac7c2"} Mar 10 00:06:17 crc kubenswrapper[4906]: I0310 00:06:17.632128 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a1ce33b4800ac33b19f03506f197894abd2b965c34db7b506d7f06e8824b0229"} Mar 10 00:06:17 crc kubenswrapper[4906]: I0310 00:06:17.632537 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:17 crc kubenswrapper[4906]: I0310 00:06:17.632583 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:17 crc kubenswrapper[4906]: I0310 00:06:17.632603 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:17 crc kubenswrapper[4906]: W0310 00:06:17.656432 4906 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.179:6443: connect: connection refused Mar 10 00:06:17 crc kubenswrapper[4906]: E0310 00:06:17.656574 4906 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.179:6443: connect: connection 
refused" logger="UnhandledError" Mar 10 00:06:17 crc kubenswrapper[4906]: I0310 00:06:17.768347 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:17 crc kubenswrapper[4906]: I0310 00:06:17.770805 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:17 crc kubenswrapper[4906]: I0310 00:06:17.770862 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:17 crc kubenswrapper[4906]: I0310 00:06:17.770874 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:17 crc kubenswrapper[4906]: I0310 00:06:17.770910 4906 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 00:06:17 crc kubenswrapper[4906]: E0310 00:06:17.772001 4906 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.179:6443: connect: connection refused" node="crc" Mar 10 00:06:17 crc kubenswrapper[4906]: W0310 00:06:17.812215 4906 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.179:6443: connect: connection refused Mar 10 00:06:17 crc kubenswrapper[4906]: E0310 00:06:17.812297 4906 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.179:6443: connect: connection refused" logger="UnhandledError" Mar 10 00:06:18 crc kubenswrapper[4906]: I0310 00:06:18.639623 4906 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" 
containerID="0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463" exitCode=0 Mar 10 00:06:18 crc kubenswrapper[4906]: I0310 00:06:18.639716 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463"} Mar 10 00:06:18 crc kubenswrapper[4906]: I0310 00:06:18.639840 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:18 crc kubenswrapper[4906]: I0310 00:06:18.640593 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:18 crc kubenswrapper[4906]: I0310 00:06:18.640620 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:18 crc kubenswrapper[4906]: I0310 00:06:18.640628 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:18 crc kubenswrapper[4906]: I0310 00:06:18.652377 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:18 crc kubenswrapper[4906]: I0310 00:06:18.653225 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:18 crc kubenswrapper[4906]: I0310 00:06:18.653840 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e49d2aa127c4c53cce4490ee27e0fe684c8e2dd90c4a65e07c4262b75bdc17e1"} Mar 10 00:06:18 crc kubenswrapper[4906]: I0310 00:06:18.653990 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:18 crc kubenswrapper[4906]: I0310 00:06:18.654625 4906 prober_manager.go:312] "Failed to trigger a manual run" 
probe="Readiness" Mar 10 00:06:18 crc kubenswrapper[4906]: I0310 00:06:18.654787 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:18 crc kubenswrapper[4906]: I0310 00:06:18.655079 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:18 crc kubenswrapper[4906]: I0310 00:06:18.655111 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:18 crc kubenswrapper[4906]: I0310 00:06:18.655119 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:18 crc kubenswrapper[4906]: I0310 00:06:18.655145 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:18 crc kubenswrapper[4906]: I0310 00:06:18.655182 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:18 crc kubenswrapper[4906]: I0310 00:06:18.655197 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:18 crc kubenswrapper[4906]: I0310 00:06:18.655768 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:18 crc kubenswrapper[4906]: I0310 00:06:18.655785 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:18 crc kubenswrapper[4906]: I0310 00:06:18.655793 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:18 crc kubenswrapper[4906]: I0310 00:06:18.657198 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:18 crc kubenswrapper[4906]: I0310 00:06:18.657224 4906 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:18 crc kubenswrapper[4906]: I0310 00:06:18.657233 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:18 crc kubenswrapper[4906]: I0310 00:06:18.861731 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:06:19 crc kubenswrapper[4906]: I0310 00:06:19.537485 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:06:19 crc kubenswrapper[4906]: I0310 00:06:19.659883 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a3ad3cc19888cbd84c556e52eda6a3be770768b7316982d6442717f6aee62191"} Mar 10 00:06:19 crc kubenswrapper[4906]: I0310 00:06:19.659974 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3effb7d619bec3fd1ef64a418efe1e3503cb2dd187ce2188911a90b4034d331d"} Mar 10 00:06:19 crc kubenswrapper[4906]: I0310 00:06:19.659982 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:19 crc kubenswrapper[4906]: I0310 00:06:19.659997 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4e8b43acbd8d86bc6523a3304162f30fcf4068197c4a56018dc56e1c8ae0bcd4"} Mar 10 00:06:19 crc kubenswrapper[4906]: I0310 00:06:19.660015 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a9e5dc928d14830f187866775b2cd95c46cff2d7749025b36d16f6489662bafa"} Mar 
10 00:06:19 crc kubenswrapper[4906]: I0310 00:06:19.660988 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:19 crc kubenswrapper[4906]: I0310 00:06:19.661017 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:19 crc kubenswrapper[4906]: I0310 00:06:19.661028 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:19 crc kubenswrapper[4906]: I0310 00:06:19.703764 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:06:19 crc kubenswrapper[4906]: I0310 00:06:19.703961 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:19 crc kubenswrapper[4906]: I0310 00:06:19.705601 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:19 crc kubenswrapper[4906]: I0310 00:06:19.705670 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:19 crc kubenswrapper[4906]: I0310 00:06:19.705692 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:19 crc kubenswrapper[4906]: I0310 00:06:19.991045 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:06:20 crc kubenswrapper[4906]: I0310 00:06:20.570998 4906 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 10 00:06:20 crc kubenswrapper[4906]: I0310 00:06:20.668689 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ecd888b723dd7b84ded4f27c69d82e2b203fec96dbaab8198df43bf0806b34ae"} Mar 10 00:06:20 crc kubenswrapper[4906]: I0310 00:06:20.668806 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:20 crc kubenswrapper[4906]: I0310 00:06:20.668806 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:20 crc kubenswrapper[4906]: I0310 00:06:20.670107 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:20 crc kubenswrapper[4906]: I0310 00:06:20.670138 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:20 crc kubenswrapper[4906]: I0310 00:06:20.670147 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:20 crc kubenswrapper[4906]: I0310 00:06:20.670382 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:20 crc kubenswrapper[4906]: I0310 00:06:20.670422 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:20 crc kubenswrapper[4906]: I0310 00:06:20.670440 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:20 crc kubenswrapper[4906]: I0310 00:06:20.972553 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:20 crc kubenswrapper[4906]: I0310 00:06:20.974427 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:20 crc kubenswrapper[4906]: I0310 00:06:20.974479 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 10 00:06:20 crc kubenswrapper[4906]: I0310 00:06:20.974496 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:20 crc kubenswrapper[4906]: I0310 00:06:20.974535 4906 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 00:06:21 crc kubenswrapper[4906]: I0310 00:06:21.049956 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:06:21 crc kubenswrapper[4906]: I0310 00:06:21.050197 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:21 crc kubenswrapper[4906]: I0310 00:06:21.051759 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:21 crc kubenswrapper[4906]: I0310 00:06:21.051827 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:21 crc kubenswrapper[4906]: I0310 00:06:21.051845 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:21 crc kubenswrapper[4906]: I0310 00:06:21.672828 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:21 crc kubenswrapper[4906]: I0310 00:06:21.673121 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:21 crc kubenswrapper[4906]: I0310 00:06:21.674415 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:21 crc kubenswrapper[4906]: I0310 00:06:21.674484 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:21 crc kubenswrapper[4906]: I0310 00:06:21.674501 4906 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:21 crc kubenswrapper[4906]: I0310 00:06:21.675135 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:21 crc kubenswrapper[4906]: I0310 00:06:21.675203 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:21 crc kubenswrapper[4906]: I0310 00:06:21.675220 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:21 crc kubenswrapper[4906]: I0310 00:06:21.844554 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:06:21 crc kubenswrapper[4906]: I0310 00:06:21.844871 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:21 crc kubenswrapper[4906]: I0310 00:06:21.846843 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:21 crc kubenswrapper[4906]: I0310 00:06:21.846910 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:21 crc kubenswrapper[4906]: I0310 00:06:21.846933 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:21 crc kubenswrapper[4906]: I0310 00:06:21.858484 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:06:22 crc kubenswrapper[4906]: I0310 00:06:22.417762 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 10 00:06:22 crc kubenswrapper[4906]: I0310 00:06:22.676225 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Mar 10 00:06:22 crc kubenswrapper[4906]: I0310 00:06:22.676256 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:22 crc kubenswrapper[4906]: I0310 00:06:22.677838 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:22 crc kubenswrapper[4906]: I0310 00:06:22.677888 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:22 crc kubenswrapper[4906]: I0310 00:06:22.677906 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:22 crc kubenswrapper[4906]: I0310 00:06:22.678220 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:22 crc kubenswrapper[4906]: I0310 00:06:22.678261 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:22 crc kubenswrapper[4906]: I0310 00:06:22.678278 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:22 crc kubenswrapper[4906]: I0310 00:06:22.779161 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 00:06:22 crc kubenswrapper[4906]: I0310 00:06:22.779785 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:22 crc kubenswrapper[4906]: I0310 00:06:22.781608 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:22 crc kubenswrapper[4906]: I0310 00:06:22.781708 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:22 crc kubenswrapper[4906]: I0310 00:06:22.781729 4906 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:23 crc kubenswrapper[4906]: I0310 00:06:23.064453 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 10 00:06:23 crc kubenswrapper[4906]: I0310 00:06:23.679394 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:23 crc kubenswrapper[4906]: I0310 00:06:23.680432 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:23 crc kubenswrapper[4906]: I0310 00:06:23.680501 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:23 crc kubenswrapper[4906]: I0310 00:06:23.680519 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:24 crc kubenswrapper[4906]: I0310 00:06:24.050474 4906 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 00:06:24 crc kubenswrapper[4906]: I0310 00:06:24.050762 4906 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 00:06:24 crc kubenswrapper[4906]: E0310 00:06:24.673681 4906 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node 
\"crc\" not found" Mar 10 00:06:27 crc kubenswrapper[4906]: I0310 00:06:27.618016 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:06:27 crc kubenswrapper[4906]: I0310 00:06:27.618195 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:27 crc kubenswrapper[4906]: I0310 00:06:27.619877 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:27 crc kubenswrapper[4906]: I0310 00:06:27.619958 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:27 crc kubenswrapper[4906]: I0310 00:06:27.619977 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:28 crc kubenswrapper[4906]: I0310 00:06:28.503262 4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 10 00:06:28 crc kubenswrapper[4906]: E0310 00:06:28.578388 4906 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:28Z is after 2026-02-23T05:33:13Z" node="crc" Mar 10 00:06:28 crc kubenswrapper[4906]: E0310 00:06:28.578998 4906 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:28Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 10 00:06:28 crc 
kubenswrapper[4906]: W0310 00:06:28.586765 4906 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:28Z is after 2026-02-23T05:33:13Z Mar 10 00:06:28 crc kubenswrapper[4906]: E0310 00:06:28.586879 4906 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:28Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 00:06:28 crc kubenswrapper[4906]: E0310 00:06:28.587414 4906 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:28Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 00:06:28 crc kubenswrapper[4906]: I0310 00:06:28.587963 4906 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 10 00:06:28 crc kubenswrapper[4906]: I0310 00:06:28.588040 4906 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 10 00:06:28 crc kubenswrapper[4906]: E0310 00:06:28.590109 4906 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:28Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b521fb8fb8425 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:14.501377061 +0000 UTC m=+0.649272213,LastTimestamp:2026-03-10 00:06:14.501377061 +0000 UTC m=+0.649272213,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:28 crc kubenswrapper[4906]: W0310 00:06:28.593097 4906 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:28Z is after 2026-02-23T05:33:13Z Mar 10 00:06:28 crc kubenswrapper[4906]: E0310 00:06:28.593170 4906 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-10T00:06:28Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 00:06:28 crc kubenswrapper[4906]: W0310 00:06:28.593954 4906 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:28Z is after 2026-02-23T05:33:13Z Mar 10 00:06:28 crc kubenswrapper[4906]: E0310 00:06:28.594039 4906 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:28Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 00:06:28 crc kubenswrapper[4906]: I0310 00:06:28.595218 4906 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 10 00:06:28 crc kubenswrapper[4906]: I0310 00:06:28.595268 4906 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 10 00:06:28 crc kubenswrapper[4906]: W0310 00:06:28.600967 4906 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get 
"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:28Z is after 2026-02-23T05:33:13Z Mar 10 00:06:28 crc kubenswrapper[4906]: E0310 00:06:28.601032 4906 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:28Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 00:06:28 crc kubenswrapper[4906]: I0310 00:06:28.868646 4906 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 10 00:06:28 crc kubenswrapper[4906]: [+]log ok Mar 10 00:06:28 crc kubenswrapper[4906]: [+]etcd ok Mar 10 00:06:28 crc kubenswrapper[4906]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 10 00:06:28 crc kubenswrapper[4906]: [+]poststarthook/openshift.io-api-request-count-filter ok Mar 10 00:06:28 crc kubenswrapper[4906]: [+]poststarthook/openshift.io-startkubeinformers ok Mar 10 00:06:28 crc kubenswrapper[4906]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Mar 10 00:06:28 crc kubenswrapper[4906]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Mar 10 00:06:28 crc kubenswrapper[4906]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 10 00:06:28 crc kubenswrapper[4906]: [+]poststarthook/generic-apiserver-start-informers ok Mar 10 00:06:28 crc kubenswrapper[4906]: [+]poststarthook/priority-and-fairness-config-consumer ok Mar 10 00:06:28 crc kubenswrapper[4906]: 
[+]poststarthook/priority-and-fairness-filter ok Mar 10 00:06:28 crc kubenswrapper[4906]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 10 00:06:28 crc kubenswrapper[4906]: [+]poststarthook/start-apiextensions-informers ok Mar 10 00:06:28 crc kubenswrapper[4906]: [-]poststarthook/start-apiextensions-controllers failed: reason withheld Mar 10 00:06:28 crc kubenswrapper[4906]: [-]poststarthook/crd-informer-synced failed: reason withheld Mar 10 00:06:28 crc kubenswrapper[4906]: [+]poststarthook/start-system-namespaces-controller ok Mar 10 00:06:28 crc kubenswrapper[4906]: [+]poststarthook/start-cluster-authentication-info-controller ok Mar 10 00:06:28 crc kubenswrapper[4906]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Mar 10 00:06:28 crc kubenswrapper[4906]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Mar 10 00:06:28 crc kubenswrapper[4906]: [+]poststarthook/start-legacy-token-tracking-controller ok Mar 10 00:06:28 crc kubenswrapper[4906]: [+]poststarthook/start-service-ip-repair-controllers ok Mar 10 00:06:28 crc kubenswrapper[4906]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Mar 10 00:06:28 crc kubenswrapper[4906]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Mar 10 00:06:28 crc kubenswrapper[4906]: [+]poststarthook/priority-and-fairness-config-producer ok Mar 10 00:06:28 crc kubenswrapper[4906]: [+]poststarthook/bootstrap-controller ok Mar 10 00:06:28 crc kubenswrapper[4906]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Mar 10 00:06:28 crc kubenswrapper[4906]: [+]poststarthook/start-kube-aggregator-informers ok Mar 10 00:06:28 crc kubenswrapper[4906]: [+]poststarthook/apiservice-status-local-available-controller ok Mar 10 00:06:28 crc kubenswrapper[4906]: [+]poststarthook/apiservice-status-remote-available-controller ok Mar 10 00:06:28 crc kubenswrapper[4906]: [+]poststarthook/apiservice-registration-controller ok Mar 10 00:06:28 
crc kubenswrapper[4906]: [+]poststarthook/apiservice-wait-for-first-sync ok Mar 10 00:06:28 crc kubenswrapper[4906]: [+]poststarthook/apiservice-discovery-controller ok Mar 10 00:06:28 crc kubenswrapper[4906]: [+]poststarthook/kube-apiserver-autoregistration ok Mar 10 00:06:28 crc kubenswrapper[4906]: [+]autoregister-completion ok Mar 10 00:06:28 crc kubenswrapper[4906]: [+]poststarthook/apiservice-openapi-controller ok Mar 10 00:06:28 crc kubenswrapper[4906]: [+]poststarthook/apiservice-openapiv3-controller ok Mar 10 00:06:28 crc kubenswrapper[4906]: livez check failed Mar 10 00:06:28 crc kubenswrapper[4906]: I0310 00:06:28.868730 4906 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 00:06:29 crc kubenswrapper[4906]: I0310 00:06:29.180538 4906 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:51976->192.168.126.11:17697: read: connection reset by peer" start-of-body= Mar 10 00:06:29 crc kubenswrapper[4906]: I0310 00:06:29.180613 4906 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:51976->192.168.126.11:17697: read: connection reset by peer" Mar 10 00:06:29 crc kubenswrapper[4906]: I0310 00:06:29.506915 4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:06:29Z is after 2026-02-23T05:33:13Z Mar 10 00:06:29 crc kubenswrapper[4906]: I0310 00:06:29.538754 4906 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Mar 10 00:06:29 crc kubenswrapper[4906]: I0310 00:06:29.538868 4906 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Mar 10 00:06:29 crc kubenswrapper[4906]: I0310 00:06:29.699089 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 10 00:06:29 crc kubenswrapper[4906]: I0310 00:06:29.701572 4906 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e49d2aa127c4c53cce4490ee27e0fe684c8e2dd90c4a65e07c4262b75bdc17e1" exitCode=255 Mar 10 00:06:29 crc kubenswrapper[4906]: I0310 00:06:29.701649 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"e49d2aa127c4c53cce4490ee27e0fe684c8e2dd90c4a65e07c4262b75bdc17e1"} Mar 10 00:06:29 crc kubenswrapper[4906]: I0310 00:06:29.701846 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:29 crc kubenswrapper[4906]: I0310 00:06:29.702953 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 
10 00:06:29 crc kubenswrapper[4906]: I0310 00:06:29.702988 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:29 crc kubenswrapper[4906]: I0310 00:06:29.702997 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:29 crc kubenswrapper[4906]: I0310 00:06:29.703559 4906 scope.go:117] "RemoveContainer" containerID="e49d2aa127c4c53cce4490ee27e0fe684c8e2dd90c4a65e07c4262b75bdc17e1" Mar 10 00:06:30 crc kubenswrapper[4906]: I0310 00:06:30.505230 4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:30Z is after 2026-02-23T05:33:13Z Mar 10 00:06:30 crc kubenswrapper[4906]: I0310 00:06:30.706737 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 10 00:06:30 crc kubenswrapper[4906]: I0310 00:06:30.707193 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 10 00:06:30 crc kubenswrapper[4906]: I0310 00:06:30.709366 4906 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a3a330144f57a1bcf74019cd337dd1646e39a52f792bdf38f3026ef490640fda" exitCode=255 Mar 10 00:06:30 crc kubenswrapper[4906]: I0310 00:06:30.709418 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a3a330144f57a1bcf74019cd337dd1646e39a52f792bdf38f3026ef490640fda"} Mar 10 00:06:30 crc 
kubenswrapper[4906]: I0310 00:06:30.709494 4906 scope.go:117] "RemoveContainer" containerID="e49d2aa127c4c53cce4490ee27e0fe684c8e2dd90c4a65e07c4262b75bdc17e1" Mar 10 00:06:30 crc kubenswrapper[4906]: I0310 00:06:30.709675 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:30 crc kubenswrapper[4906]: I0310 00:06:30.710858 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:30 crc kubenswrapper[4906]: I0310 00:06:30.710901 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:30 crc kubenswrapper[4906]: I0310 00:06:30.710911 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:30 crc kubenswrapper[4906]: I0310 00:06:30.711525 4906 scope.go:117] "RemoveContainer" containerID="a3a330144f57a1bcf74019cd337dd1646e39a52f792bdf38f3026ef490640fda" Mar 10 00:06:30 crc kubenswrapper[4906]: E0310 00:06:30.711783 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 00:06:31 crc kubenswrapper[4906]: I0310 00:06:31.509207 4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:31Z is after 2026-02-23T05:33:13Z Mar 10 00:06:31 crc kubenswrapper[4906]: I0310 00:06:31.715833 4906 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 10 00:06:32 crc kubenswrapper[4906]: I0310 00:06:32.509351 4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:32Z is after 2026-02-23T05:33:13Z Mar 10 00:06:32 crc kubenswrapper[4906]: W0310 00:06:32.758701 4906 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:32Z is after 2026-02-23T05:33:13Z Mar 10 00:06:32 crc kubenswrapper[4906]: E0310 00:06:32.759239 4906 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:32Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 00:06:33 crc kubenswrapper[4906]: I0310 00:06:33.133574 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 10 00:06:33 crc kubenswrapper[4906]: I0310 00:06:33.133781 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:33 crc kubenswrapper[4906]: I0310 00:06:33.135172 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:33 crc kubenswrapper[4906]: I0310 
00:06:33.135219 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:33 crc kubenswrapper[4906]: I0310 00:06:33.135237 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:33 crc kubenswrapper[4906]: I0310 00:06:33.155393 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 10 00:06:33 crc kubenswrapper[4906]: I0310 00:06:33.508016 4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:06:33Z is after 2026-02-23T05:33:13Z Mar 10 00:06:33 crc kubenswrapper[4906]: I0310 00:06:33.725684 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:33 crc kubenswrapper[4906]: I0310 00:06:33.727094 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:33 crc kubenswrapper[4906]: I0310 00:06:33.727147 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:33 crc kubenswrapper[4906]: I0310 00:06:33.727168 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:33 crc kubenswrapper[4906]: I0310 00:06:33.867860 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:06:33 crc kubenswrapper[4906]: I0310 00:06:33.868085 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:33 crc kubenswrapper[4906]: I0310 00:06:33.869488 4906 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:33 crc kubenswrapper[4906]: I0310 00:06:33.869547 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:33 crc kubenswrapper[4906]: I0310 00:06:33.869560 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:33 crc kubenswrapper[4906]: I0310 00:06:33.870376 4906 scope.go:117] "RemoveContainer" containerID="a3a330144f57a1bcf74019cd337dd1646e39a52f792bdf38f3026ef490640fda" Mar 10 00:06:33 crc kubenswrapper[4906]: E0310 00:06:33.870624 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 00:06:33 crc kubenswrapper[4906]: I0310 00:06:33.877413 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:06:34 crc kubenswrapper[4906]: I0310 00:06:34.051945 4906 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 00:06:34 crc kubenswrapper[4906]: I0310 00:06:34.052023 4906 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get 
\"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 00:06:34 crc kubenswrapper[4906]: I0310 00:06:34.509797 4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:06:34 crc kubenswrapper[4906]: E0310 00:06:34.674413 4906 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 00:06:34 crc kubenswrapper[4906]: I0310 00:06:34.728709 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:34 crc kubenswrapper[4906]: I0310 00:06:34.730326 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:34 crc kubenswrapper[4906]: I0310 00:06:34.730384 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:34 crc kubenswrapper[4906]: I0310 00:06:34.730396 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:34 crc kubenswrapper[4906]: I0310 00:06:34.731121 4906 scope.go:117] "RemoveContainer" containerID="a3a330144f57a1bcf74019cd337dd1646e39a52f792bdf38f3026ef490640fda" Mar 10 00:06:34 crc kubenswrapper[4906]: E0310 00:06:34.731409 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 00:06:34 crc 
kubenswrapper[4906]: I0310 00:06:34.979498 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:34 crc kubenswrapper[4906]: I0310 00:06:34.980878 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:34 crc kubenswrapper[4906]: I0310 00:06:34.980931 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:34 crc kubenswrapper[4906]: I0310 00:06:34.980943 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:34 crc kubenswrapper[4906]: I0310 00:06:34.981006 4906 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 00:06:34 crc kubenswrapper[4906]: E0310 00:06:34.985887 4906 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 10 00:06:34 crc kubenswrapper[4906]: E0310 00:06:34.986753 4906 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 10 00:06:35 crc kubenswrapper[4906]: I0310 00:06:35.509096 4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:06:36 crc kubenswrapper[4906]: W0310 00:06:36.395812 4906 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the 
cluster scope Mar 10 00:06:36 crc kubenswrapper[4906]: E0310 00:06:36.395922 4906 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 10 00:06:36 crc kubenswrapper[4906]: I0310 00:06:36.511045 4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:06:37 crc kubenswrapper[4906]: I0310 00:06:37.207897 4906 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 10 00:06:37 crc kubenswrapper[4906]: I0310 00:06:37.225893 4906 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 10 00:06:37 crc kubenswrapper[4906]: I0310 00:06:37.507528 4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:06:37 crc kubenswrapper[4906]: W0310 00:06:37.862395 4906 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 10 00:06:37 crc kubenswrapper[4906]: E0310 00:06:37.862457 4906 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource 
\"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 10 00:06:38 crc kubenswrapper[4906]: I0310 00:06:38.508572 4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.598137 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b521fb8fb8425 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:14.501377061 +0000 UTC m=+0.649272213,LastTimestamp:2026-03-10 00:06:14.501377061 +0000 UTC m=+0.649272213,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.603413 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b521fbbb4142c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:14.547026988 +0000 UTC m=+0.694922120,LastTimestamp:2026-03-10 00:06:14.547026988 +0000 UTC 
m=+0.694922120,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.606881 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b521fbbb4c62f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:14.547072559 +0000 UTC m=+0.694967681,LastTimestamp:2026-03-10 00:06:14.547072559 +0000 UTC m=+0.694967681,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.608105 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b521fbbb4fea1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:14.547087009 +0000 UTC m=+0.694982131,LastTimestamp:2026-03-10 00:06:14.547087009 +0000 UTC m=+0.694982131,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 
00:06:38.610728 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b521fc29cde78 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:14.662946424 +0000 UTC m=+0.810841566,LastTimestamp:2026-03-10 00:06:14.662946424 +0000 UTC m=+0.810841566,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.612339 4906 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b521fbbb4142c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b521fbbb4142c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:14.547026988 +0000 UTC m=+0.694922120,LastTimestamp:2026-03-10 00:06:14.677763771 +0000 UTC m=+0.825658923,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.617984 4906 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b521fbbb4c62f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" 
in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b521fbbb4c62f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:14.547072559 +0000 UTC m=+0.694967681,LastTimestamp:2026-03-10 00:06:14.677803652 +0000 UTC m=+0.825698804,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.623828 4906 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b521fbbb4fea1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b521fbbb4fea1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:14.547087009 +0000 UTC m=+0.694982131,LastTimestamp:2026-03-10 00:06:14.677835453 +0000 UTC m=+0.825730605,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.630600 4906 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b521fbbb4142c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b521fbbb4142c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:14.547026988 +0000 UTC m=+0.694922120,LastTimestamp:2026-03-10 00:06:14.679743757 +0000 UTC m=+0.827638879,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.635743 4906 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b521fbbb4c62f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b521fbbb4c62f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:14.547072559 +0000 UTC m=+0.694967681,LastTimestamp:2026-03-10 00:06:14.679782888 +0000 UTC m=+0.827678010,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.641862 4906 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b521fbbb4fea1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b521fbbb4fea1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc 
status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:14.547087009 +0000 UTC m=+0.694982131,LastTimestamp:2026-03-10 00:06:14.679794298 +0000 UTC m=+0.827689420,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.647836 4906 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b521fbbb4142c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b521fbbb4142c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:14.547026988 +0000 UTC m=+0.694922120,LastTimestamp:2026-03-10 00:06:14.679978133 +0000 UTC m=+0.827873275,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.652380 4906 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b521fbbb4c62f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b521fbbb4c62f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:14.547072559 +0000 UTC 
m=+0.694967681,LastTimestamp:2026-03-10 00:06:14.680039625 +0000 UTC m=+0.827934767,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.657100 4906 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b521fbbb4fea1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b521fbbb4fea1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:14.547087009 +0000 UTC m=+0.694982131,LastTimestamp:2026-03-10 00:06:14.680065616 +0000 UTC m=+0.827960758,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: I0310 00:06:38.663533 4906 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:06:38 crc kubenswrapper[4906]: I0310 00:06:38.663900 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.663936 4906 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b521fbbb4142c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b521fbbb4142c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:14.547026988 +0000 UTC m=+0.694922120,LastTimestamp:2026-03-10 00:06:14.680990372 +0000 UTC m=+0.828885484,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: I0310 00:06:38.665436 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:38 crc kubenswrapper[4906]: I0310 00:06:38.665476 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:38 crc kubenswrapper[4906]: I0310 00:06:38.665492 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:38 crc kubenswrapper[4906]: I0310 00:06:38.666286 4906 scope.go:117] "RemoveContainer" containerID="a3a330144f57a1bcf74019cd337dd1646e39a52f792bdf38f3026ef490640fda" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.666589 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.669859 4906 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b521fbbb4c62f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{crc.189b521fbbb4c62f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:14.547072559 +0000 UTC m=+0.694967681,LastTimestamp:2026-03-10 00:06:14.681006712 +0000 UTC m=+0.828901824,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.674495 4906 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b521fbbb4fea1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b521fbbb4fea1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:14.547087009 +0000 UTC m=+0.694982131,LastTimestamp:2026-03-10 00:06:14.681095205 +0000 UTC m=+0.828990307,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.679321 4906 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b521fbbb4142c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b521fbbb4142c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:14.547026988 +0000 UTC m=+0.694922120,LastTimestamp:2026-03-10 00:06:14.681113495 +0000 UTC m=+0.829008607,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.684056 4906 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b521fbbb4c62f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b521fbbb4c62f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:14.547072559 +0000 UTC m=+0.694967681,LastTimestamp:2026-03-10 00:06:14.681128436 +0000 UTC m=+0.829023548,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.689517 4906 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b521fbbb4fea1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b521fbbb4fea1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc 
status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:14.547087009 +0000 UTC m=+0.694982131,LastTimestamp:2026-03-10 00:06:14.681138066 +0000 UTC m=+0.829033178,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.697846 4906 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b521fbbb4142c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b521fbbb4142c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:14.547026988 +0000 UTC m=+0.694922120,LastTimestamp:2026-03-10 00:06:14.682129864 +0000 UTC m=+0.830024976,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.702980 4906 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b521fbbb4c62f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b521fbbb4c62f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:14.547072559 +0000 UTC 
m=+0.694967681,LastTimestamp:2026-03-10 00:06:14.682146154 +0000 UTC m=+0.830041266,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.708451 4906 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b521fbbb4fea1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b521fbbb4fea1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:14.547087009 +0000 UTC m=+0.694982131,LastTimestamp:2026-03-10 00:06:14.682157805 +0000 UTC m=+0.830052917,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.713344 4906 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b521fbbb4142c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b521fbbb4142c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:14.547026988 +0000 UTC m=+0.694922120,LastTimestamp:2026-03-10 00:06:14.682190066 +0000 UTC m=+0.830085178,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.720115 4906 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b521fbbb4c62f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b521fbbb4c62f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:14.547072559 +0000 UTC m=+0.694967681,LastTimestamp:2026-03-10 00:06:14.682198946 +0000 UTC m=+0.830094058,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.727132 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b521fdb6bfc67 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:15.079173223 +0000 UTC m=+1.227068335,LastTimestamp:2026-03-10 00:06:15.079173223 +0000 UTC 
m=+1.227068335,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.733022 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b521fdba4dee2 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:15.082901218 +0000 UTC m=+1.230796330,LastTimestamp:2026-03-10 00:06:15.082901218 +0000 UTC m=+1.230796330,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.740063 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b521fdc8e0a23 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:15.098182179 +0000 UTC m=+1.246077291,LastTimestamp:2026-03-10 00:06:15.098182179 +0000 UTC m=+1.246077291,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.744163 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b521fde46a4ff openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:15.127057663 +0000 UTC m=+1.274952775,LastTimestamp:2026-03-10 00:06:15.127057663 +0000 UTC m=+1.274952775,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.748517 4906 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b521fded78350 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:15.13655176 +0000 UTC m=+1.284446872,LastTimestamp:2026-03-10 00:06:15.13655176 +0000 UTC m=+1.284446872,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.753073 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b521ffdfa044d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:15.658906701 +0000 UTC m=+1.806801813,LastTimestamp:2026-03-10 00:06:15.658906701 +0000 UTC m=+1.806801813,Count:1,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.757621 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b521ffe049917 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:15.659600151 +0000 UTC m=+1.807495263,LastTimestamp:2026-03-10 00:06:15.659600151 +0000 UTC m=+1.807495263,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.762058 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b521ffe0561fe openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:15.659651582 +0000 UTC m=+1.807546694,LastTimestamp:2026-03-10 00:06:15.659651582 +0000 UTC 
m=+1.807546694,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.766338 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b521ffe069ba3 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:15.659731875 +0000 UTC m=+1.807626997,LastTimestamp:2026-03-10 00:06:15.659731875 +0000 UTC m=+1.807626997,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.769837 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b521ffe9a46ed openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:15.669409517 +0000 UTC m=+1.817304679,LastTimestamp:2026-03-10 00:06:15.669409517 +0000 UTC 
m=+1.817304679,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.773668 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b521ffeacddd0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:15.670627792 +0000 UTC m=+1.818522904,LastTimestamp:2026-03-10 00:06:15.670627792 +0000 UTC m=+1.818522904,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.777560 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b521ffebeb981 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:15.671798145 +0000 UTC 
m=+1.819693267,LastTimestamp:2026-03-10 00:06:15.671798145 +0000 UTC m=+1.819693267,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.781491 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b521ffedc49d7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:15.673735639 +0000 UTC m=+1.821630761,LastTimestamp:2026-03-10 00:06:15.673735639 +0000 UTC m=+1.821630761,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.784941 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b521ffef206f5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:15.675160309 +0000 UTC m=+1.823055411,LastTimestamp:2026-03-10 00:06:15.675160309 +0000 UTC m=+1.823055411,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.790977 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b521fff5169d0 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:15.681411536 +0000 UTC m=+1.829306668,LastTimestamp:2026-03-10 00:06:15.681411536 +0000 UTC m=+1.829306668,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.796690 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b521fffad8472 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:15.687447666 +0000 UTC m=+1.835342788,LastTimestamp:2026-03-10 00:06:15.687447666 +0000 UTC m=+1.835342788,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.800664 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b52200fd67822 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:15.958566946 +0000 UTC m=+2.106462068,LastTimestamp:2026-03-10 00:06:15.958566946 +0000 UTC m=+2.106462068,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.804504 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b5220109a4628 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:15.971399208 +0000 UTC m=+2.119294310,LastTimestamp:2026-03-10 00:06:15.971399208 +0000 UTC m=+2.119294310,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.810580 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b522010ae7fe1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:15.972724705 +0000 UTC m=+2.120619827,LastTimestamp:2026-03-10 00:06:15.972724705 +0000 
UTC m=+2.120619827,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.814463 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b52201c6626ef openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:16.169309935 +0000 UTC m=+2.317205047,LastTimestamp:2026-03-10 00:06:16.169309935 +0000 UTC m=+2.317205047,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.818228 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b52201d5df64d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container 
kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:16.185550413 +0000 UTC m=+2.333445535,LastTimestamp:2026-03-10 00:06:16.185550413 +0000 UTC m=+2.333445535,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.821652 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b52201d80c8bb openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:16.187832507 +0000 UTC m=+2.335727619,LastTimestamp:2026-03-10 00:06:16.187832507 +0000 UTC m=+2.335727619,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.825238 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b52202c11e1a5 openshift-kube-controller-manager 
0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:16.432222629 +0000 UTC m=+2.580117751,LastTimestamp:2026-03-10 00:06:16.432222629 +0000 UTC m=+2.580117751,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.828447 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b52202e624df4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:16.471047668 +0000 UTC m=+2.618942820,LastTimestamp:2026-03-10 00:06:16.471047668 +0000 UTC m=+2.618942820,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.834131 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b5220365744a6 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:16.604542118 +0000 UTC m=+2.752437230,LastTimestamp:2026-03-10 00:06:16.604542118 +0000 UTC m=+2.752437230,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.837471 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b5220366ec705 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:16.606082821 +0000 UTC 
m=+2.753977933,LastTimestamp:2026-03-10 00:06:16.606082821 +0000 UTC m=+2.753977933,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.844049 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b522037382336 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:16.619279158 +0000 UTC m=+2.767174280,LastTimestamp:2026-03-10 00:06:16.619279158 +0000 UTC m=+2.767174280,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.850560 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b522037a42d35 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:16.626359605 +0000 UTC m=+2.774254757,LastTimestamp:2026-03-10 00:06:16.626359605 +0000 UTC m=+2.774254757,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.854601 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b5220474eccad openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:16.889199789 +0000 UTC m=+3.037094901,LastTimestamp:2026-03-10 00:06:16.889199789 +0000 UTC m=+3.037094901,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.859606 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b5220475534b9 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:16.889619641 +0000 UTC m=+3.037514753,LastTimestamp:2026-03-10 00:06:16.889619641 +0000 UTC m=+3.037514753,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.863295 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b5220475d203d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:16.890138685 +0000 UTC m=+3.038033797,LastTimestamp:2026-03-10 00:06:16.890138685 +0000 UTC m=+3.038033797,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.867882 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b52204776687d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 
UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:16.891795581 +0000 UTC m=+3.039690693,LastTimestamp:2026-03-10 00:06:16.891795581 +0000 UTC m=+3.039690693,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.872144 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b5220486f116a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:16.908091754 +0000 UTC m=+3.055986866,LastTimestamp:2026-03-10 00:06:16.908091754 +0000 UTC m=+3.055986866,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.876520 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b5220487fe2ce openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:16.909193934 +0000 UTC m=+3.057089046,LastTimestamp:2026-03-10 00:06:16.909193934 +0000 UTC m=+3.057089046,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.881436 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b522048bd68e5 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:16.913225957 +0000 UTC m=+3.061121069,LastTimestamp:2026-03-10 00:06:16.913225957 +0000 UTC m=+3.061121069,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc 
kubenswrapper[4906]: E0310 00:06:38.885990 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b522048c1c798 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:16.913512344 +0000 UTC m=+3.061407446,LastTimestamp:2026-03-10 00:06:16.913512344 +0000 UTC m=+3.061407446,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.889303 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b5220490b2ca2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:16.918322338 +0000 UTC m=+3.066217450,LastTimestamp:2026-03-10 00:06:16.918322338 
+0000 UTC m=+3.066217450,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.892535 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b5220548c36ba openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:17.111328442 +0000 UTC m=+3.259223554,LastTimestamp:2026-03-10 00:06:17.111328442 +0000 UTC m=+3.259223554,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.898525 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b5220559efb00 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container 
kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:17.129335552 +0000 UTC m=+3.277230664,LastTimestamp:2026-03-10 00:06:17.129335552 +0000 UTC m=+3.277230664,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.902330 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b522055af9e25 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:17.130425893 +0000 UTC m=+3.278321015,LastTimestamp:2026-03-10 00:06:17.130425893 +0000 UTC m=+3.278321015,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.905881 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b522055c4dd79 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:17.131818361 +0000 UTC m=+3.279713473,LastTimestamp:2026-03-10 00:06:17.131818361 +0000 UTC m=+3.279713473,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.909964 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b5220564e432f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:17.140822831 +0000 UTC m=+3.288717933,LastTimestamp:2026-03-10 00:06:17.140822831 +0000 UTC m=+3.288717933,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.913543 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b52205664ab48 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:17.142291272 +0000 UTC m=+3.290186384,LastTimestamp:2026-03-10 00:06:17.142291272 +0000 UTC m=+3.290186384,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.915544 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b5220609456ea openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:17.313187562 +0000 UTC m=+3.461082684,LastTimestamp:2026-03-10 00:06:17.313187562 +0000 UTC m=+3.461082684,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.918962 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b522060d74451 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:17.317573713 +0000 UTC m=+3.465468825,LastTimestamp:2026-03-10 00:06:17.317573713 +0000 UTC m=+3.465468825,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.922324 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b522061d31e74 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:17.334079092 
+0000 UTC m=+3.481974204,LastTimestamp:2026-03-10 00:06:17.334079092 +0000 UTC m=+3.481974204,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.925810 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b522061efed77 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:17.335967095 +0000 UTC m=+3.483862207,LastTimestamp:2026-03-10 00:06:17.335967095 +0000 UTC m=+3.483862207,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.928897 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b522061fefee0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:17.336954592 +0000 UTC m=+3.484849704,LastTimestamp:2026-03-10 00:06:17.336954592 +0000 UTC m=+3.484849704,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.933077 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b5220670bc44f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:17.421677647 +0000 UTC m=+3.569572759,LastTimestamp:2026-03-10 00:06:17.421677647 +0000 UTC m=+3.569572759,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.936934 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b52206b322be9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:17.491303401 +0000 UTC m=+3.639198513,LastTimestamp:2026-03-10 00:06:17.491303401 +0000 UTC m=+3.639198513,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.941552 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b52206be5f2f3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:17.503085299 +0000 UTC m=+3.650980411,LastTimestamp:2026-03-10 00:06:17.503085299 +0000 UTC m=+3.650980411,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.945123 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189b52206bf928fa openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:17.504344314 +0000 UTC m=+3.652239426,LastTimestamp:2026-03-10 00:06:17.504344314 +0000 UTC m=+3.652239426,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.950049 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b5220732053fb openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:17.624351739 +0000 UTC m=+3.772246851,LastTimestamp:2026-03-10 00:06:17.624351739 +0000 UTC m=+3.772246851,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.955441 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b522076202ca5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:17.674673317 +0000 UTC m=+3.822568449,LastTimestamp:2026-03-10 00:06:17.674673317 +0000 UTC m=+3.822568449,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.959751 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b522076d97e51 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:17.686818385 +0000 UTC m=+3.834713507,LastTimestamp:2026-03-10 
00:06:17.686818385 +0000 UTC m=+3.834713507,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.963854 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b52207e923def openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:17.816366575 +0000 UTC m=+3.964261687,LastTimestamp:2026-03-10 00:06:17.816366575 +0000 UTC m=+3.964261687,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.969320 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b52207f2b12e3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:17.826382563 +0000 UTC m=+3.974277675,LastTimestamp:2026-03-10 00:06:17.826382563 +0000 UTC 
m=+3.974277675,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.975757 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b5220afc36253 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:18.641670739 +0000 UTC m=+4.789565851,LastTimestamp:2026-03-10 00:06:18.641670739 +0000 UTC m=+4.789565851,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.980315 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b5220bdeaafca openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:18.879127498 +0000 UTC 
m=+5.027022650,LastTimestamp:2026-03-10 00:06:18.879127498 +0000 UTC m=+5.027022650,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.983682 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b5220be77f770 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:18.888386416 +0000 UTC m=+5.036281528,LastTimestamp:2026-03-10 00:06:18.888386416 +0000 UTC m=+5.036281528,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.989405 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b5220be87c432 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:18.889421874 +0000 UTC m=+5.037317026,LastTimestamp:2026-03-10 00:06:18.889421874 +0000 UTC m=+5.037317026,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.993578 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b5220c83ec958 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:19.052411224 +0000 UTC m=+5.200306336,LastTimestamp:2026-03-10 00:06:19.052411224 +0000 UTC m=+5.200306336,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:38 crc kubenswrapper[4906]: E0310 00:06:38.997894 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b5220c8d32076 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:19.062132854 +0000 UTC 
m=+5.210027956,LastTimestamp:2026-03-10 00:06:19.062132854 +0000 UTC m=+5.210027956,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:39 crc kubenswrapper[4906]: E0310 00:06:39.002054 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b5220c8e465f3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:19.063264755 +0000 UTC m=+5.211159867,LastTimestamp:2026-03-10 00:06:19.063264755 +0000 UTC m=+5.211159867,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:39 crc kubenswrapper[4906]: E0310 00:06:39.007252 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b5220d4fa217a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container 
etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:19.26601561 +0000 UTC m=+5.413910762,LastTimestamp:2026-03-10 00:06:19.26601561 +0000 UTC m=+5.413910762,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:39 crc kubenswrapper[4906]: E0310 00:06:39.011252 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b5220d5b6313c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:19.278340412 +0000 UTC m=+5.426235544,LastTimestamp:2026-03-10 00:06:19.278340412 +0000 UTC m=+5.426235544,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:39 crc kubenswrapper[4906]: E0310 00:06:39.015155 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b5220d5c742ee openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:19.279459054 +0000 UTC m=+5.427354166,LastTimestamp:2026-03-10 00:06:19.279459054 +0000 UTC m=+5.427354166,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:39 crc kubenswrapper[4906]: E0310 00:06:39.019739 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b5220e2f9d015 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:19.500875797 +0000 UTC m=+5.648770919,LastTimestamp:2026-03-10 00:06:19.500875797 +0000 UTC m=+5.648770919,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:39 crc kubenswrapper[4906]: E0310 00:06:39.023702 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b5220e3a69b5f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:19.512200031 +0000 UTC m=+5.660095153,LastTimestamp:2026-03-10 00:06:19.512200031 +0000 UTC m=+5.660095153,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:39 crc kubenswrapper[4906]: E0310 00:06:39.027779 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b5220e3b7d09d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:19.513327773 +0000 UTC m=+5.661222905,LastTimestamp:2026-03-10 00:06:19.513327773 +0000 UTC m=+5.661222905,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:39 crc kubenswrapper[4906]: E0310 00:06:39.032108 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b5220f0debae5 openshift-etcd 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:19.733981925 +0000 UTC m=+5.881877047,LastTimestamp:2026-03-10 00:06:19.733981925 +0000 UTC m=+5.881877047,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:39 crc kubenswrapper[4906]: E0310 00:06:39.036189 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b5220f1a134cf openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:19.746727119 +0000 UTC m=+5.894622231,LastTimestamp:2026-03-10 00:06:19.746727119 +0000 UTC m=+5.894622231,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:39 crc kubenswrapper[4906]: E0310 00:06:39.041909 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 10 00:06:39 crc kubenswrapper[4906]: &Event{ObjectMeta:{kube-controller-manager-crc.189b5221f22ac601 
openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 10 00:06:39 crc kubenswrapper[4906]: body: Mar 10 00:06:39 crc kubenswrapper[4906]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:24.050710017 +0000 UTC m=+10.198605209,LastTimestamp:2026-03-10 00:06:24.050710017 +0000 UTC m=+10.198605209,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 00:06:39 crc kubenswrapper[4906]: > Mar 10 00:06:39 crc kubenswrapper[4906]: E0310 00:06:39.047043 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b5221f22c9e03 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:24.050830851 +0000 UTC 
m=+10.198725993,LastTimestamp:2026-03-10 00:06:24.050830851 +0000 UTC m=+10.198725993,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:39 crc kubenswrapper[4906]: E0310 00:06:39.052720 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 10 00:06:39 crc kubenswrapper[4906]: &Event{ObjectMeta:{kube-apiserver-crc.189b5223009c9fcb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 10 00:06:39 crc kubenswrapper[4906]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 10 00:06:39 crc kubenswrapper[4906]: Mar 10 00:06:39 crc kubenswrapper[4906]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:28.588019659 +0000 UTC m=+14.735914781,LastTimestamp:2026-03-10 00:06:28.588019659 +0000 UTC m=+14.735914781,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 00:06:39 crc kubenswrapper[4906]: > Mar 10 00:06:39 crc kubenswrapper[4906]: E0310 00:06:39.056301 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189b5223009d62c2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:28.58806957 +0000 UTC m=+14.735964702,LastTimestamp:2026-03-10 00:06:28.58806957 +0000 UTC m=+14.735964702,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:39 crc kubenswrapper[4906]: E0310 00:06:39.060288 4906 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b5223009c9fcb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 10 00:06:39 crc kubenswrapper[4906]: &Event{ObjectMeta:{kube-apiserver-crc.189b5223009c9fcb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 10 00:06:39 crc kubenswrapper[4906]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 10 00:06:39 crc kubenswrapper[4906]: Mar 10 00:06:39 crc kubenswrapper[4906]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:28.588019659 +0000 UTC 
m=+14.735914781,LastTimestamp:2026-03-10 00:06:28.5952531 +0000 UTC m=+14.743148232,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 00:06:39 crc kubenswrapper[4906]: > Mar 10 00:06:39 crc kubenswrapper[4906]: E0310 00:06:39.064570 4906 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b5223009d62c2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b5223009d62c2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:28.58806957 +0000 UTC m=+14.735964702,LastTimestamp:2026-03-10 00:06:28.595290941 +0000 UTC m=+14.743186063,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:39 crc kubenswrapper[4906]: E0310 00:06:39.068607 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 10 00:06:39 crc kubenswrapper[4906]: &Event{ObjectMeta:{kube-apiserver-crc.189b522311577f9d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 500 Mar 10 00:06:39 crc kubenswrapper[4906]: body: [+]ping ok Mar 10 00:06:39 crc kubenswrapper[4906]: [+]log ok Mar 10 00:06:39 crc kubenswrapper[4906]: [+]etcd ok Mar 10 00:06:39 crc kubenswrapper[4906]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 10 00:06:39 crc kubenswrapper[4906]: [+]poststarthook/openshift.io-api-request-count-filter ok Mar 10 00:06:39 crc kubenswrapper[4906]: [+]poststarthook/openshift.io-startkubeinformers ok Mar 10 00:06:39 crc kubenswrapper[4906]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Mar 10 00:06:39 crc kubenswrapper[4906]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Mar 10 00:06:39 crc kubenswrapper[4906]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 10 00:06:39 crc kubenswrapper[4906]: [+]poststarthook/generic-apiserver-start-informers ok Mar 10 00:06:39 crc kubenswrapper[4906]: [+]poststarthook/priority-and-fairness-config-consumer ok Mar 10 00:06:39 crc kubenswrapper[4906]: [+]poststarthook/priority-and-fairness-filter ok Mar 10 00:06:39 crc kubenswrapper[4906]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 10 00:06:39 crc kubenswrapper[4906]: [+]poststarthook/start-apiextensions-informers ok Mar 10 00:06:39 crc kubenswrapper[4906]: [-]poststarthook/start-apiextensions-controllers failed: reason withheld Mar 10 00:06:39 crc kubenswrapper[4906]: [-]poststarthook/crd-informer-synced failed: reason withheld Mar 10 00:06:39 crc kubenswrapper[4906]: [+]poststarthook/start-system-namespaces-controller ok Mar 10 00:06:39 crc kubenswrapper[4906]: [+]poststarthook/start-cluster-authentication-info-controller ok Mar 10 00:06:39 crc kubenswrapper[4906]: 
[+]poststarthook/start-kube-apiserver-identity-lease-controller ok Mar 10 00:06:39 crc kubenswrapper[4906]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Mar 10 00:06:39 crc kubenswrapper[4906]: [+]poststarthook/start-legacy-token-tracking-controller ok Mar 10 00:06:39 crc kubenswrapper[4906]: [+]poststarthook/start-service-ip-repair-controllers ok Mar 10 00:06:39 crc kubenswrapper[4906]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Mar 10 00:06:39 crc kubenswrapper[4906]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Mar 10 00:06:39 crc kubenswrapper[4906]: [+]poststarthook/priority-and-fairness-config-producer ok Mar 10 00:06:39 crc kubenswrapper[4906]: [+]poststarthook/bootstrap-controller ok Mar 10 00:06:39 crc kubenswrapper[4906]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Mar 10 00:06:39 crc kubenswrapper[4906]: [+]poststarthook/start-kube-aggregator-informers ok Mar 10 00:06:39 crc kubenswrapper[4906]: [+]poststarthook/apiservice-status-local-available-controller ok Mar 10 00:06:39 crc kubenswrapper[4906]: [+]poststarthook/apiservice-status-remote-available-controller ok Mar 10 00:06:39 crc kubenswrapper[4906]: [+]poststarthook/apiservice-registration-controller ok Mar 10 00:06:39 crc kubenswrapper[4906]: [+]poststarthook/apiservice-wait-for-first-sync ok Mar 10 00:06:39 crc kubenswrapper[4906]: [+]poststarthook/apiservice-discovery-controller ok Mar 10 00:06:39 crc kubenswrapper[4906]: [+]poststarthook/kube-apiserver-autoregistration ok Mar 10 00:06:39 crc kubenswrapper[4906]: [+]autoregister-completion ok Mar 10 00:06:39 crc kubenswrapper[4906]: [+]poststarthook/apiservice-openapi-controller ok Mar 10 00:06:39 crc kubenswrapper[4906]: [+]poststarthook/apiservice-openapiv3-controller ok Mar 10 00:06:39 crc kubenswrapper[4906]: livez check failed Mar 10 00:06:39 crc kubenswrapper[4906]: Mar 10 00:06:39 crc kubenswrapper[4906]: 
,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:28.868702109 +0000 UTC m=+15.016597221,LastTimestamp:2026-03-10 00:06:28.868702109 +0000 UTC m=+15.016597221,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 00:06:39 crc kubenswrapper[4906]: > Mar 10 00:06:39 crc kubenswrapper[4906]: E0310 00:06:39.072471 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b52231158591f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:28.868757791 +0000 UTC m=+15.016652903,LastTimestamp:2026-03-10 00:06:28.868757791 +0000 UTC m=+15.016652903,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:39 crc kubenswrapper[4906]: E0310 00:06:39.077416 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 10 00:06:39 crc kubenswrapper[4906]: &Event{ObjectMeta:{kube-apiserver-crc.189b522323ee8fae openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Liveness probe error: Get "https://192.168.126.11:17697/healthz": read tcp 192.168.126.11:51976->192.168.126.11:17697: read: connection reset by peer Mar 10 00:06:39 crc kubenswrapper[4906]: body: Mar 10 00:06:39 crc kubenswrapper[4906]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:29.180592046 +0000 UTC m=+15.328487158,LastTimestamp:2026-03-10 00:06:29.180592046 +0000 UTC m=+15.328487158,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 00:06:39 crc kubenswrapper[4906]: > Mar 10 00:06:39 crc kubenswrapper[4906]: E0310 00:06:39.082940 4906 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b5221f22ac601\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 10 00:06:39 crc kubenswrapper[4906]: &Event{ObjectMeta:{kube-controller-manager-crc.189b5221f22ac601 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 10 00:06:39 crc kubenswrapper[4906]: body: Mar 10 00:06:39 crc kubenswrapper[4906]: 
,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:24.050710017 +0000 UTC m=+10.198605209,LastTimestamp:2026-03-10 00:06:34.052003553 +0000 UTC m=+20.199898665,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 00:06:39 crc kubenswrapper[4906]: > Mar 10 00:06:39 crc kubenswrapper[4906]: E0310 00:06:39.087101 4906 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b5221f22c9e03\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b5221f22c9e03 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:24.050830851 +0000 UTC m=+10.198725993,LastTimestamp:2026-03-10 00:06:34.052057964 +0000 UTC m=+20.199953076,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:39 crc kubenswrapper[4906]: I0310 00:06:39.509544 4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:06:39 crc kubenswrapper[4906]: I0310 
00:06:39.537958 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:06:39 crc kubenswrapper[4906]: I0310 00:06:39.538204 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:39 crc kubenswrapper[4906]: I0310 00:06:39.539717 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:39 crc kubenswrapper[4906]: I0310 00:06:39.539777 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:39 crc kubenswrapper[4906]: I0310 00:06:39.539796 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:39 crc kubenswrapper[4906]: I0310 00:06:39.540686 4906 scope.go:117] "RemoveContainer" containerID="a3a330144f57a1bcf74019cd337dd1646e39a52f792bdf38f3026ef490640fda" Mar 10 00:06:39 crc kubenswrapper[4906]: E0310 00:06:39.540991 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 00:06:40 crc kubenswrapper[4906]: W0310 00:06:40.157529 4906 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 10 00:06:40 crc kubenswrapper[4906]: E0310 00:06:40.157598 4906 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot 
list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 10 00:06:40 crc kubenswrapper[4906]: I0310 00:06:40.510489 4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:06:41 crc kubenswrapper[4906]: I0310 00:06:41.506230 4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:06:41 crc kubenswrapper[4906]: I0310 00:06:41.986845 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:41 crc kubenswrapper[4906]: I0310 00:06:41.988493 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:41 crc kubenswrapper[4906]: I0310 00:06:41.988564 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:41 crc kubenswrapper[4906]: I0310 00:06:41.988592 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:41 crc kubenswrapper[4906]: I0310 00:06:41.988674 4906 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 00:06:41 crc kubenswrapper[4906]: E0310 00:06:41.993690 4906 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 10 00:06:41 crc kubenswrapper[4906]: E0310 00:06:41.994358 4906 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" 
cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 10 00:06:42 crc kubenswrapper[4906]: I0310 00:06:42.507096 4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:06:43 crc kubenswrapper[4906]: I0310 00:06:43.509880 4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:06:44 crc kubenswrapper[4906]: I0310 00:06:44.051520 4906 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 00:06:44 crc kubenswrapper[4906]: I0310 00:06:44.051623 4906 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 00:06:44 crc kubenswrapper[4906]: I0310 00:06:44.051720 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:06:44 crc kubenswrapper[4906]: I0310 00:06:44.051933 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:44 crc 
kubenswrapper[4906]: I0310 00:06:44.053546 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:44 crc kubenswrapper[4906]: I0310 00:06:44.053591 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:44 crc kubenswrapper[4906]: I0310 00:06:44.053604 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:44 crc kubenswrapper[4906]: I0310 00:06:44.054298 4906 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"b037feadaa2542ab3d4b67a5ced4e73ab0764d2c739c7ab3df075007346e16fe"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 10 00:06:44 crc kubenswrapper[4906]: I0310 00:06:44.054544 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://b037feadaa2542ab3d4b67a5ced4e73ab0764d2c739c7ab3df075007346e16fe" gracePeriod=30 Mar 10 00:06:44 crc kubenswrapper[4906]: E0310 00:06:44.058600 4906 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b5221f22ac601\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 10 00:06:44 crc kubenswrapper[4906]: &Event{ObjectMeta:{kube-controller-manager-crc.189b5221f22ac601 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 10 00:06:44 crc kubenswrapper[4906]: body: Mar 10 00:06:44 crc kubenswrapper[4906]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:24.050710017 +0000 UTC m=+10.198605209,LastTimestamp:2026-03-10 00:06:44.051589849 +0000 UTC m=+30.199484971,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 00:06:44 crc kubenswrapper[4906]: > Mar 10 00:06:44 crc kubenswrapper[4906]: E0310 00:06:44.065967 4906 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b5221f22c9e03\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b5221f22c9e03 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:24.050830851 +0000 UTC m=+10.198725993,LastTimestamp:2026-03-10 00:06:44.051686972 
+0000 UTC m=+30.199582094,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:44 crc kubenswrapper[4906]: E0310 00:06:44.072224 4906 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b52269a7cac25 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:44.054518821 +0000 UTC m=+30.202413943,LastTimestamp:2026-03-10 00:06:44.054518821 +0000 UTC m=+30.202413943,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:44 crc kubenswrapper[4906]: E0310 00:06:44.183033 4906 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b521ffef206f5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b521ffef206f5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:15.675160309 +0000 UTC m=+1.823055411,LastTimestamp:2026-03-10 00:06:44.17937534 +0000 UTC m=+30.327270482,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:44 crc kubenswrapper[4906]: W0310 00:06:44.361945 4906 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 10 00:06:44 crc kubenswrapper[4906]: E0310 00:06:44.362033 4906 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 10 00:06:44 crc kubenswrapper[4906]: E0310 00:06:44.423661 4906 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b52200fd67822\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b52200fd67822 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:15.958566946 +0000 UTC m=+2.106462068,LastTimestamp:2026-03-10 00:06:44.414472864 +0000 UTC m=+30.562368006,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:44 crc kubenswrapper[4906]: E0310 00:06:44.436365 4906 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b5220109a4628\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b5220109a4628 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:15.971399208 +0000 UTC m=+2.119294310,LastTimestamp:2026-03-10 00:06:44.428106963 +0000 UTC m=+30.576002085,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:44 crc kubenswrapper[4906]: I0310 00:06:44.508936 4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" 
cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:06:44 crc kubenswrapper[4906]: E0310 00:06:44.675591 4906 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 00:06:44 crc kubenswrapper[4906]: I0310 00:06:44.761880 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 10 00:06:44 crc kubenswrapper[4906]: I0310 00:06:44.762547 4906 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="b037feadaa2542ab3d4b67a5ced4e73ab0764d2c739c7ab3df075007346e16fe" exitCode=255 Mar 10 00:06:44 crc kubenswrapper[4906]: I0310 00:06:44.762664 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"b037feadaa2542ab3d4b67a5ced4e73ab0764d2c739c7ab3df075007346e16fe"} Mar 10 00:06:44 crc kubenswrapper[4906]: I0310 00:06:44.762731 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"dab4356524a9cb048a66600ffecef81aa5f9f442b722d6780409970b0fe775d6"} Mar 10 00:06:44 crc kubenswrapper[4906]: I0310 00:06:44.762904 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:44 crc kubenswrapper[4906]: I0310 00:06:44.764356 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:44 crc kubenswrapper[4906]: I0310 00:06:44.764419 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:44 crc 
kubenswrapper[4906]: I0310 00:06:44.764437 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:45 crc kubenswrapper[4906]: I0310 00:06:45.508719 4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:06:46 crc kubenswrapper[4906]: I0310 00:06:46.510853 4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:06:47 crc kubenswrapper[4906]: I0310 00:06:47.508341 4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:06:48 crc kubenswrapper[4906]: I0310 00:06:48.510358 4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:06:48 crc kubenswrapper[4906]: I0310 00:06:48.994059 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:48 crc kubenswrapper[4906]: I0310 00:06:48.995940 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:48 crc kubenswrapper[4906]: I0310 00:06:48.995995 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:48 crc kubenswrapper[4906]: I0310 00:06:48.996010 4906 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:48 crc kubenswrapper[4906]: I0310 00:06:48.996051 4906 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 00:06:49 crc kubenswrapper[4906]: E0310 00:06:49.001028 4906 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 10 00:06:49 crc kubenswrapper[4906]: E0310 00:06:49.004712 4906 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 10 00:06:49 crc kubenswrapper[4906]: I0310 00:06:49.510351 4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:06:49 crc kubenswrapper[4906]: I0310 00:06:49.704715 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:06:49 crc kubenswrapper[4906]: I0310 00:06:49.704927 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:49 crc kubenswrapper[4906]: I0310 00:06:49.706084 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:49 crc kubenswrapper[4906]: I0310 00:06:49.706128 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:49 crc kubenswrapper[4906]: I0310 00:06:49.706142 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:50 crc 
kubenswrapper[4906]: I0310 00:06:50.509879 4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:06:51 crc kubenswrapper[4906]: I0310 00:06:51.050598 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:06:51 crc kubenswrapper[4906]: I0310 00:06:51.050769 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:51 crc kubenswrapper[4906]: I0310 00:06:51.052084 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:51 crc kubenswrapper[4906]: I0310 00:06:51.052150 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:51 crc kubenswrapper[4906]: I0310 00:06:51.052164 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:51 crc kubenswrapper[4906]: I0310 00:06:51.509813 4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:06:52 crc kubenswrapper[4906]: I0310 00:06:52.508788 4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:06:53 crc kubenswrapper[4906]: I0310 00:06:53.509305 4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" 
cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:06:53 crc kubenswrapper[4906]: I0310 00:06:53.576146 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:53 crc kubenswrapper[4906]: I0310 00:06:53.577381 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:53 crc kubenswrapper[4906]: I0310 00:06:53.577414 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:53 crc kubenswrapper[4906]: I0310 00:06:53.577425 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:53 crc kubenswrapper[4906]: I0310 00:06:53.578398 4906 scope.go:117] "RemoveContainer" containerID="a3a330144f57a1bcf74019cd337dd1646e39a52f792bdf38f3026ef490640fda" Mar 10 00:06:54 crc kubenswrapper[4906]: W0310 00:06:54.017100 4906 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 10 00:06:54 crc kubenswrapper[4906]: E0310 00:06:54.017188 4906 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 10 00:06:54 crc kubenswrapper[4906]: I0310 00:06:54.050953 4906 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection 
(Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 00:06:54 crc kubenswrapper[4906]: I0310 00:06:54.051085 4906 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 00:06:54 crc kubenswrapper[4906]: E0310 00:06:54.056442 4906 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b5221f22ac601\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 10 00:06:54 crc kubenswrapper[4906]: &Event{ObjectMeta:{kube-controller-manager-crc.189b5221f22ac601 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 10 00:06:54 crc kubenswrapper[4906]: body: Mar 10 00:06:54 crc kubenswrapper[4906]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:24.050710017 +0000 UTC m=+10.198605209,LastTimestamp:2026-03-10 00:06:54.051047443 +0000 UTC m=+40.198942595,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 00:06:54 crc kubenswrapper[4906]: > Mar 10 00:06:54 crc kubenswrapper[4906]: E0310 
00:06:54.061130 4906 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b5221f22c9e03\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b5221f22c9e03 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:06:24.050830851 +0000 UTC m=+10.198725993,LastTimestamp:2026-03-10 00:06:54.051126466 +0000 UTC m=+40.199021618,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:06:54 crc kubenswrapper[4906]: I0310 00:06:54.505466 4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:06:54 crc kubenswrapper[4906]: E0310 00:06:54.676539 4906 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 00:06:54 crc kubenswrapper[4906]: I0310 00:06:54.794394 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 10 00:06:54 crc kubenswrapper[4906]: 
I0310 00:06:54.795621 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 10 00:06:54 crc kubenswrapper[4906]: I0310 00:06:54.798059 4906 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c4ab422d45102787731cdbfd6d0e400974761d5af01f0a43d04cc25228c1db30" exitCode=255 Mar 10 00:06:54 crc kubenswrapper[4906]: I0310 00:06:54.798146 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"c4ab422d45102787731cdbfd6d0e400974761d5af01f0a43d04cc25228c1db30"} Mar 10 00:06:54 crc kubenswrapper[4906]: I0310 00:06:54.798260 4906 scope.go:117] "RemoveContainer" containerID="a3a330144f57a1bcf74019cd337dd1646e39a52f792bdf38f3026ef490640fda" Mar 10 00:06:54 crc kubenswrapper[4906]: I0310 00:06:54.798424 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:54 crc kubenswrapper[4906]: I0310 00:06:54.800677 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:54 crc kubenswrapper[4906]: I0310 00:06:54.800715 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:54 crc kubenswrapper[4906]: I0310 00:06:54.800724 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:54 crc kubenswrapper[4906]: I0310 00:06:54.801440 4906 scope.go:117] "RemoveContainer" containerID="c4ab422d45102787731cdbfd6d0e400974761d5af01f0a43d04cc25228c1db30" Mar 10 00:06:54 crc kubenswrapper[4906]: E0310 00:06:54.801671 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" 
with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 00:06:55 crc kubenswrapper[4906]: I0310 00:06:55.510480 4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:06:55 crc kubenswrapper[4906]: I0310 00:06:55.802942 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 10 00:06:56 crc kubenswrapper[4906]: I0310 00:06:56.001352 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:56 crc kubenswrapper[4906]: I0310 00:06:56.002789 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:56 crc kubenswrapper[4906]: I0310 00:06:56.002837 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:56 crc kubenswrapper[4906]: I0310 00:06:56.002854 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:56 crc kubenswrapper[4906]: I0310 00:06:56.002884 4906 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 00:06:56 crc kubenswrapper[4906]: E0310 00:06:56.004271 4906 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 10 00:06:56 crc kubenswrapper[4906]: E0310 00:06:56.011930 4906 controller.go:145] 
"Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 10 00:06:56 crc kubenswrapper[4906]: I0310 00:06:56.509482 4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:06:57 crc kubenswrapper[4906]: I0310 00:06:57.507857 4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:06:58 crc kubenswrapper[4906]: W0310 00:06:58.491358 4906 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 10 00:06:58 crc kubenswrapper[4906]: E0310 00:06:58.491444 4906 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 10 00:06:58 crc kubenswrapper[4906]: I0310 00:06:58.509222 4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:06:58 crc kubenswrapper[4906]: I0310 00:06:58.664093 4906 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:06:58 crc 
kubenswrapper[4906]: I0310 00:06:58.664408 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:58 crc kubenswrapper[4906]: I0310 00:06:58.666264 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:58 crc kubenswrapper[4906]: I0310 00:06:58.666316 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:58 crc kubenswrapper[4906]: I0310 00:06:58.666333 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:58 crc kubenswrapper[4906]: I0310 00:06:58.667191 4906 scope.go:117] "RemoveContainer" containerID="c4ab422d45102787731cdbfd6d0e400974761d5af01f0a43d04cc25228c1db30" Mar 10 00:06:58 crc kubenswrapper[4906]: E0310 00:06:58.667579 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 00:06:59 crc kubenswrapper[4906]: I0310 00:06:59.507162 4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:06:59 crc kubenswrapper[4906]: I0310 00:06:59.537980 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:06:59 crc kubenswrapper[4906]: I0310 00:06:59.538174 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:06:59 crc kubenswrapper[4906]: 
I0310 00:06:59.540029 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:06:59 crc kubenswrapper[4906]: I0310 00:06:59.540077 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:06:59 crc kubenswrapper[4906]: I0310 00:06:59.540088 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:06:59 crc kubenswrapper[4906]: I0310 00:06:59.540734 4906 scope.go:117] "RemoveContainer" containerID="c4ab422d45102787731cdbfd6d0e400974761d5af01f0a43d04cc25228c1db30" Mar 10 00:06:59 crc kubenswrapper[4906]: E0310 00:06:59.540913 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 00:06:59 crc kubenswrapper[4906]: W0310 00:06:59.986881 4906 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 10 00:06:59 crc kubenswrapper[4906]: E0310 00:06:59.986955 4906 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 10 00:07:00 crc kubenswrapper[4906]: I0310 00:07:00.507490 4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource 
"csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:07:01 crc kubenswrapper[4906]: W0310 00:07:01.073717 4906 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 10 00:07:01 crc kubenswrapper[4906]: E0310 00:07:01.073797 4906 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 10 00:07:01 crc kubenswrapper[4906]: I0310 00:07:01.506905 4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:07:02 crc kubenswrapper[4906]: I0310 00:07:02.510444 4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:07:02 crc kubenswrapper[4906]: I0310 00:07:02.785480 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 00:07:02 crc kubenswrapper[4906]: I0310 00:07:02.785771 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:07:02 crc kubenswrapper[4906]: I0310 00:07:02.787814 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:02 crc kubenswrapper[4906]: I0310 00:07:02.787876 4906 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:02 crc kubenswrapper[4906]: I0310 00:07:02.787895 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:03 crc kubenswrapper[4906]: I0310 00:07:03.005059 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:07:03 crc kubenswrapper[4906]: I0310 00:07:03.006695 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:03 crc kubenswrapper[4906]: I0310 00:07:03.006800 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:03 crc kubenswrapper[4906]: I0310 00:07:03.006825 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:03 crc kubenswrapper[4906]: I0310 00:07:03.006867 4906 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 00:07:03 crc kubenswrapper[4906]: E0310 00:07:03.012546 4906 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 10 00:07:03 crc kubenswrapper[4906]: E0310 00:07:03.016377 4906 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 10 00:07:03 crc kubenswrapper[4906]: I0310 00:07:03.506709 4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope 
Mar 10 00:07:03 crc kubenswrapper[4906]: I0310 00:07:03.854486 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:07:03 crc kubenswrapper[4906]: I0310 00:07:03.854839 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:07:03 crc kubenswrapper[4906]: I0310 00:07:03.856309 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:03 crc kubenswrapper[4906]: I0310 00:07:03.856368 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:03 crc kubenswrapper[4906]: I0310 00:07:03.856393 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:03 crc kubenswrapper[4906]: I0310 00:07:03.863456 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:07:04 crc kubenswrapper[4906]: I0310 00:07:04.508770 4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:07:04 crc kubenswrapper[4906]: E0310 00:07:04.676931 4906 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 00:07:04 crc kubenswrapper[4906]: I0310 00:07:04.830138 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:07:04 crc kubenswrapper[4906]: I0310 00:07:04.831014 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:04 crc kubenswrapper[4906]: I0310 00:07:04.831058 4906 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:04 crc kubenswrapper[4906]: I0310 00:07:04.831070 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:05 crc kubenswrapper[4906]: I0310 00:07:05.507232 4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:07:06 crc kubenswrapper[4906]: I0310 00:07:06.511224 4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:07:07 crc kubenswrapper[4906]: I0310 00:07:07.509005 4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:07:08 crc kubenswrapper[4906]: I0310 00:07:08.510140 4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:07:09 crc kubenswrapper[4906]: I0310 00:07:09.506559 4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:07:10 crc kubenswrapper[4906]: I0310 00:07:10.013316 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:07:10 crc kubenswrapper[4906]: 
I0310 00:07:10.015014 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:10 crc kubenswrapper[4906]: I0310 00:07:10.015082 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:10 crc kubenswrapper[4906]: I0310 00:07:10.015101 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:10 crc kubenswrapper[4906]: I0310 00:07:10.015145 4906 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 00:07:10 crc kubenswrapper[4906]: E0310 00:07:10.018096 4906 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 10 00:07:10 crc kubenswrapper[4906]: E0310 00:07:10.018135 4906 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 10 00:07:10 crc kubenswrapper[4906]: I0310 00:07:10.508219 4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:07:11 crc kubenswrapper[4906]: I0310 00:07:11.506600 4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:07:12 crc kubenswrapper[4906]: I0310 00:07:12.509225 4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io 
"crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:07:13 crc kubenswrapper[4906]: I0310 00:07:13.509393 4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:07:13 crc kubenswrapper[4906]: I0310 00:07:13.576415 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:07:13 crc kubenswrapper[4906]: I0310 00:07:13.577973 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:13 crc kubenswrapper[4906]: I0310 00:07:13.578013 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:13 crc kubenswrapper[4906]: I0310 00:07:13.578059 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:13 crc kubenswrapper[4906]: I0310 00:07:13.578745 4906 scope.go:117] "RemoveContainer" containerID="c4ab422d45102787731cdbfd6d0e400974761d5af01f0a43d04cc25228c1db30" Mar 10 00:07:13 crc kubenswrapper[4906]: E0310 00:07:13.578963 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 00:07:14 crc kubenswrapper[4906]: I0310 00:07:14.509810 4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource 
"csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:07:14 crc kubenswrapper[4906]: E0310 00:07:14.677041 4906 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 00:07:15 crc kubenswrapper[4906]: I0310 00:07:15.511606 4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:07:16 crc kubenswrapper[4906]: I0310 00:07:16.510655 4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:07:17 crc kubenswrapper[4906]: I0310 00:07:17.019365 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:07:17 crc kubenswrapper[4906]: I0310 00:07:17.022850 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:17 crc kubenswrapper[4906]: I0310 00:07:17.022956 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:17 crc kubenswrapper[4906]: I0310 00:07:17.022978 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:17 crc kubenswrapper[4906]: I0310 00:07:17.023025 4906 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 00:07:17 crc kubenswrapper[4906]: E0310 00:07:17.029966 4906 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" 
interval="7s" Mar 10 00:07:17 crc kubenswrapper[4906]: E0310 00:07:17.030020 4906 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 10 00:07:17 crc kubenswrapper[4906]: I0310 00:07:17.509658 4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:07:18 crc kubenswrapper[4906]: I0310 00:07:18.510314 4906 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 00:07:19 crc kubenswrapper[4906]: I0310 00:07:19.021212 4906 csr.go:261] certificate signing request csr-bn96v is approved, waiting to be issued Mar 10 00:07:19 crc kubenswrapper[4906]: I0310 00:07:19.031892 4906 csr.go:257] certificate signing request csr-bn96v is issued Mar 10 00:07:19 crc kubenswrapper[4906]: I0310 00:07:19.065191 4906 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 10 00:07:19 crc kubenswrapper[4906]: I0310 00:07:19.339524 4906 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 10 00:07:20 crc kubenswrapper[4906]: I0310 00:07:20.033518 4906 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-17 07:34:49.141117624 +0000 UTC Mar 10 00:07:20 crc kubenswrapper[4906]: I0310 00:07:20.033577 4906 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6775h27m29.107543968s for next certificate rotation Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 
00:07:24.030204 4906 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.033139 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.033698 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.033722 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.033996 4906 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.043800 4906 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.044272 4906 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 10 00:07:24 crc kubenswrapper[4906]: E0310 00:07:24.044307 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.051062 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.051095 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.051107 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.051128 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.051143 4906 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:24Z","lastTransitionTime":"2026-03-10T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:24 crc kubenswrapper[4906]: E0310 00:07:24.065485 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a60179e-98a5-4d7f-9dd0-5aef84f37492\\\",\\\"systemUUID\\\":\\\"5a964b87-c4f2-4bba-95e1-b2c12e6316ae\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.076271 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.076343 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.076376 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.076404 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.076426 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:24Z","lastTransitionTime":"2026-03-10T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:24 crc kubenswrapper[4906]: E0310 00:07:24.087746 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a60179e-98a5-4d7f-9dd0-5aef84f37492\\\",\\\"systemUUID\\\":\\\"5a964b87-c4f2-4bba-95e1-b2c12e6316ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.095607 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.095681 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.095695 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.095718 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.095731 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:24Z","lastTransitionTime":"2026-03-10T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:24 crc kubenswrapper[4906]: E0310 00:07:24.107625 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a60179e-98a5-4d7f-9dd0-5aef84f37492\\\",\\\"systemUUID\\\":\\\"5a964b87-c4f2-4bba-95e1-b2c12e6316ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.114439 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.114477 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.114489 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.114507 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.114520 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:24Z","lastTransitionTime":"2026-03-10T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:24 crc kubenswrapper[4906]: E0310 00:07:24.134045 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a60179e-98a5-4d7f-9dd0-5aef84f37492\\\",\\\"systemUUID\\\":\\\"5a964b87-c4f2-4bba-95e1-b2c12e6316ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:24 crc kubenswrapper[4906]: E0310 00:07:24.135589 4906 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 00:07:24 crc kubenswrapper[4906]: E0310 00:07:24.135764 4906 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:24 crc kubenswrapper[4906]: E0310 00:07:24.236979 4906 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:24 crc kubenswrapper[4906]: E0310 00:07:24.338104 4906 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:24 crc kubenswrapper[4906]: E0310 00:07:24.438921 4906 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.480601 4906 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.527954 4906 apiserver.go:52] "Watching apiserver" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.532551 4906 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.532983 4906 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.533605 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.533598 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.533702 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.533727 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.534574 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:24 crc kubenswrapper[4906]: E0310 00:07:24.534619 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.534821 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 00:07:24 crc kubenswrapper[4906]: E0310 00:07:24.534922 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:07:24 crc kubenswrapper[4906]: E0310 00:07:24.535013 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.537713 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.537991 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.540083 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.540669 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.540970 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.541008 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.541134 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.540977 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.541261 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.542235 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:24 crc 
kubenswrapper[4906]: I0310 00:07:24.542410 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.542536 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.542731 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.542928 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:24Z","lastTransitionTime":"2026-03-10T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.560228 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.572469 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.584553 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.590961 4906 scope.go:117] "RemoveContainer" containerID="c4ab422d45102787731cdbfd6d0e400974761d5af01f0a43d04cc25228c1db30" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.593188 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.599897 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.606504 4906 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.613272 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.627613 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.642043 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.646355 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.646389 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.646402 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.646428 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.646442 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:24Z","lastTransitionTime":"2026-03-10T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.661858 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.673583 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.686256 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.697512 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.701278 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.701363 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.701971 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.702129 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.702183 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.702237 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.702276 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: 
\"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.702311 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.702343 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.702385 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.702425 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.702468 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.702508 4906 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.702544 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.702582 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.702618 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.702717 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.702753 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.702610 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.703014 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.702933 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.702979 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.703087 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.703528 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.703584 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.703619 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.703699 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.704131 4906 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.704525 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.704555 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.704586 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.704624 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.704694 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.704731 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.704807 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.704897 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.705244 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:07:24 crc kubenswrapper[4906]: E0310 00:07:24.705486 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:07:25.205443537 +0000 UTC m=+71.353338669 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.705592 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.705601 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.705770 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.705874 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.705933 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.706108 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.706155 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.706365 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.706399 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.706418 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.706457 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.706504 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.706540 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.706575 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.706614 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.706751 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.706774 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.706800 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.706841 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.706877 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod 
\"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.706913 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.706952 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.706988 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.707027 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.707054 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.707067 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.707108 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.707150 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.707188 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.707205 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.707221 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.707261 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.707365 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.707371 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.707540 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.707575 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.707822 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.707942 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.708153 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.708251 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.708288 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.708321 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.708355 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.708387 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.708419 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.708435 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.708450 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.708538 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.708619 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.708632 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.708700 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.708717 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.708756 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.708804 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.708811 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.708847 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.708884 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.708919 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.708955 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.708997 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.709021 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.709060 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.709069 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.709081 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.709097 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.709134 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.709169 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.709202 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.709233 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.709268 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" 
(UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.709286 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.709321 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.709359 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.709471 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.709612 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.709672 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.709706 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.709738 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.709774 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.709843 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.709887 4906 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.709931 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.709967 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.710000 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.710036 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.710069 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod 
\"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.710105 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.710142 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.710174 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.710206 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.710239 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.710273 4906 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.710306 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.710337 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.710372 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.710404 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.710441 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.710507 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.710543 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.710575 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.710700 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.710740 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 10 00:07:24 crc 
kubenswrapper[4906]: I0310 00:07:24.710774 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.710805 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.710838 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.710866 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.710904 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.710935 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod 
\"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.710969 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.711001 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.711031 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.711068 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.711103 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.711136 4906 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.711166 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.711200 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.711232 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.711273 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.711304 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: 
\"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.711336 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.711371 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.711403 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.711435 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.711469 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.711501 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.711535 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.711568 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.711600 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.711665 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.711705 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: 
\"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.711742 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.711775 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.711812 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.711846 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.711875 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.711912 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.711947 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.711981 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.712014 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.712047 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.712081 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.712113 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.712146 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.712176 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.712211 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.712242 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.712274 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.712304 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.712339 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.712370 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.712403 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.712470 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 
00:07:24.712539 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.712582 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.712617 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.712838 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.712930 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.712966 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.713003 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.713039 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.713072 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.713109 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.713156 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 
10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.713193 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.713230 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.713262 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.713292 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.713320 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.713353 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: 
\"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.713386 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.713422 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.713459 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.713491 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.713522 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 
00:07:24.713558 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.713714 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.713753 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.713820 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.713855 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.713953 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: 
\"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.713989 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.714023 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.714057 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.714120 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.714163 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 
00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.714199 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.714419 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.714492 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.714530 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.714709 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.714755 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.714792 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.714826 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.714858 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.714893 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.714925 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 10 00:07:24 crc 
kubenswrapper[4906]: I0310 00:07:24.714993 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.715040 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.715076 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.715111 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.715155 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod 
\"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.715194 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.715226 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.715266 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.715306 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.715343 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" 
(UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.715375 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.715416 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.715451 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.715486 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.715557 4906 reconciler_common.go:293] "Volume detached for 
volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.715585 4906 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.715604 4906 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.715623 4906 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.715671 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.715693 4906 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.715711 4906 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.715728 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") 
on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.715747 4906 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.715763 4906 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.715781 4906 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.715800 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.715821 4906 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.715839 4906 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.715859 4906 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 10 
00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.715878 4906 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.715895 4906 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.715915 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.715931 4906 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.715949 4906 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.715967 4906 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.715984 4906 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.716005 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" 
(UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.716024 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.716043 4906 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.716065 4906 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.716082 4906 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.716100 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.716117 4906 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.716135 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on 
node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.716154 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.716172 4906 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.716189 4906 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.716206 4906 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.716268 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.716309 4906 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.716328 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 
00:07:24.716349 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.716849 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8518e6b0-48c0-4076-a58a-e2ba8751c90f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1ce33b4800ac33b19f03506f197894abd2b965c34db7b506d7f06e8824b0229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1237a4e233c776796482d73c65c85e5d86e592d6a7f53a7b76fea4529fe9e435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e445f27670002f5ac446f9be56896acdbf9e3e30801b073b231cc8ce7ddac7c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ab422d45102787731cdbfd6d0e400974761d5af01f0a43d04cc25228c1db30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab422d45102787731cdbfd6d0e400974761d5af01f0a43d04cc25228c1db30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:06:54.228491 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:06:54.228657 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:06:54.229423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1709734890/tls.crt::/tmp/serving-cert-1709734890/tls.key\\\\\\\"\\\\nI0310 00:06:54.538287 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:06:54.541181 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:06:54.541202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:06:54.541225 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:06:54.541230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:06:54.549119 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 00:06:54.549178 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549189 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549199 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:06:54.549207 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:06:54.549213 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:06:54.549219 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 00:06:54.549128 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 00:06:54.551727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c49063cf40b4074c120bf78de05481a9e5638b11137a47d73d15b3e1a43f41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.709468 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.728192 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.728216 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.709535 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.709521 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.710178 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.710542 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.710882 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.711162 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.711413 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.728676 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.711588 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.711528 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.711794 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.711997 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.712258 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.712455 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.712487 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.712590 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.712891 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.713208 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.713362 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.713369 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.713687 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.713803 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.714062 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.713989 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.714962 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.714814 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.715944 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.716321 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.716693 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.729138 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.716906 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.717313 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.717344 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.717573 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.717760 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.717815 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.717961 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.718043 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.718109 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.718487 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.729311 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.719496 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.719602 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.720471 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.720546 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.720701 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.720737 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.722015 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.729424 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.722079 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.722265 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.723124 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.726950 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.726978 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.727272 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.727308 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.727296 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.727552 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.727700 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.727727 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.727738 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.727760 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.709996 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.728775 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.720193 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.729834 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.730216 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.730100 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.730394 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.730787 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.731150 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.731742 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.731823 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.732784 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.733032 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.733164 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.733272 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.734985 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.736466 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.737173 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.737139 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.737423 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.737798 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.738025 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.738190 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.738306 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.739714 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.739881 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.739925 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.740077 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.740177 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.740353 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.737876 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.740731 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.746574 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.741107 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.741212 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.741356 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.742584 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.741425 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.741438 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.741545 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.741705 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.741714 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.741980 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: E0310 00:07:24.742048 4906 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 00:07:24 crc kubenswrapper[4906]: E0310 00:07:24.746996 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 00:07:25.246965147 +0000 UTC m=+71.394860279 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.747260 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.746526 4906 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.748168 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.748789 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.742281 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.742700 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.743091 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: E0310 00:07:24.743222 4906 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 00:07:24 crc kubenswrapper[4906]: E0310 00:07:24.751263 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 00:07:25.251229632 +0000 UTC m=+71.399124744 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.743256 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.743370 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.743646 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.743957 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.743974 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.744128 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.744195 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.744290 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.744593 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.745005 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.745441 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.745518 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.745842 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.745950 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.745995 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.746044 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: E0310 00:07:24.746063 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 00:07:24 crc kubenswrapper[4906]: E0310 00:07:24.751494 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 00:07:24 crc kubenswrapper[4906]: E0310 00:07:24.751514 4906 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.746686 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: E0310 00:07:24.751552 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 00:07:25.251544431 +0000 UTC m=+71.399439543 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.753545 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:24 crc kubenswrapper[4906]: E0310 00:07:24.756944 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 00:07:24 crc kubenswrapper[4906]: E0310 
00:07:24.756980 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 00:07:24 crc kubenswrapper[4906]: E0310 00:07:24.756998 4906 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:24 crc kubenswrapper[4906]: E0310 00:07:24.757084 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 00:07:25.257067103 +0000 UTC m=+71.404962225 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.758122 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.758158 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.758172 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.758202 4906 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.758217 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:24Z","lastTransitionTime":"2026-03-10T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.760875 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.761071 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.761670 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.761964 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.762173 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.767845 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.768196 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.768417 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.768292 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.768652 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.768949 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.768749 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.770247 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.770251 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.770765 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.770790 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.770965 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.771129 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.771378 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.756658 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.777116 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.779812 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.781576 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.781616 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.781626 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.781765 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.781778 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.781799 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.783758 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.784296 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.799619 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.806845 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.810181 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.813904 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.816732 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.816770 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.816835 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.816849 4906 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.816861 4906 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" 
(UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.816871 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.816882 4906 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.816894 4906 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.816897 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.816904 4906 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.816945 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" 
Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.816960 4906 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.816975 4906 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.816989 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817003 4906 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817014 4906 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817024 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817035 4906 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817046 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: 
\"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817058 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817070 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817082 4906 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817095 4906 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817109 4906 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817122 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817167 4906 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817178 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817189 4906 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817199 4906 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817210 4906 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817222 4906 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817233 4906 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817244 4906 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817256 4906 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817268 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817279 4906 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817289 4906 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817300 4906 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817310 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817321 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: 
\"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817332 4906 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817345 4906 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817356 4906 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817366 4906 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817377 4906 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817387 4906 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817399 4906 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 10 
00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817409 4906 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817419 4906 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817430 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817441 4906 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817450 4906 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817461 4906 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817473 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817484 4906 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817496 4906 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817508 4906 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817519 4906 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817529 4906 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817544 4906 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817556 4906 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817569 4906 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817579 4906 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817591 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817603 4906 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817613 4906 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817623 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817648 4906 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817662 4906 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" 
Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817671 4906 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817682 4906 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817693 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817705 4906 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817716 4906 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817726 4906 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817736 4906 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817749 4906 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817761 4906 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817772 4906 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817781 4906 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817793 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817803 4906 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817816 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817827 4906 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817838 4906 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817849 4906 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817860 4906 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817870 4906 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817881 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817893 4906 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817903 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: 
\"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817913 4906 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817924 4906 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817937 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817948 4906 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817958 4906 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817968 4906 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817979 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: 
\"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817989 4906 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.817999 4906 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818009 4906 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818020 4906 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818032 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818043 4906 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818053 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: 
\"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818064 4906 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818074 4906 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818084 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818096 4906 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818106 4906 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818118 4906 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818128 4906 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc 
kubenswrapper[4906]: I0310 00:07:24.818138 4906 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818150 4906 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818161 4906 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818171 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818181 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818192 4906 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818201 4906 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818212 4906 reconciler_common.go:293] "Volume detached for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818223 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818233 4906 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818244 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818253 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818264 4906 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818276 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818287 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: 
\"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818298 4906 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818308 4906 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818318 4906 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818328 4906 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818340 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818357 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818368 4906 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 10 
00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818379 4906 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818390 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818400 4906 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818410 4906 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818421 4906 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818431 4906 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818442 4906 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818451 4906 reconciler_common.go:293] "Volume detached 
for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818463 4906 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818473 4906 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818484 4906 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818497 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818508 4906 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818518 4906 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818528 4906 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818539 4906 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818550 4906 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818560 4906 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818570 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818581 4906 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818592 4906 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818603 4906 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818616 4906 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818628 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818653 4906 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.818663 4906 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.854831 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.860536 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.860569 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.860579 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.860612 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.860624 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:24Z","lastTransitionTime":"2026-03-10T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.862391 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.868147 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 00:07:24 crc kubenswrapper[4906]: E0310 00:07:24.874999 4906 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 00:07:24 crc kubenswrapper[4906]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 10 00:07:24 crc kubenswrapper[4906]: set -o allexport Mar 10 00:07:24 crc kubenswrapper[4906]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 10 00:07:24 crc kubenswrapper[4906]: source /etc/kubernetes/apiserver-url.env Mar 10 00:07:24 crc kubenswrapper[4906]: else Mar 10 00:07:24 crc kubenswrapper[4906]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 10 00:07:24 crc kubenswrapper[4906]: exit 1 Mar 10 00:07:24 crc kubenswrapper[4906]: fi Mar 10 00:07:24 crc kubenswrapper[4906]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 10 00:07:24 crc kubenswrapper[4906]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 00:07:24 crc kubenswrapper[4906]: > logger="UnhandledError" Mar 10 00:07:24 crc kubenswrapper[4906]: E0310 00:07:24.876394 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 10 00:07:24 crc kubenswrapper[4906]: E0310 00:07:24.878009 4906 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 00:07:24 crc kubenswrapper[4906]: container 
&Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 10 00:07:24 crc kubenswrapper[4906]: if [[ -f "/env/_master" ]]; then Mar 10 00:07:24 crc kubenswrapper[4906]: set -o allexport Mar 10 00:07:24 crc kubenswrapper[4906]: source "/env/_master" Mar 10 00:07:24 crc kubenswrapper[4906]: set +o allexport Mar 10 00:07:24 crc kubenswrapper[4906]: fi Mar 10 00:07:24 crc kubenswrapper[4906]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 10 00:07:24 crc kubenswrapper[4906]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 10 00:07:24 crc kubenswrapper[4906]: ho_enable="--enable-hybrid-overlay" Mar 10 00:07:24 crc kubenswrapper[4906]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 10 00:07:24 crc kubenswrapper[4906]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 10 00:07:24 crc kubenswrapper[4906]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 10 00:07:24 crc kubenswrapper[4906]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 10 00:07:24 crc kubenswrapper[4906]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 10 00:07:24 crc kubenswrapper[4906]: --webhook-host=127.0.0.1 \ Mar 10 00:07:24 crc kubenswrapper[4906]: --webhook-port=9743 \ Mar 10 00:07:24 crc kubenswrapper[4906]: ${ho_enable} \ Mar 10 00:07:24 crc kubenswrapper[4906]: --enable-interconnect \ Mar 10 00:07:24 crc kubenswrapper[4906]: --disable-approver \ Mar 10 00:07:24 crc kubenswrapper[4906]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 10 00:07:24 crc kubenswrapper[4906]: --wait-for-kubernetes-api=200s \ Mar 10 00:07:24 crc kubenswrapper[4906]: 
--pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 10 00:07:24 crc kubenswrapper[4906]: --loglevel="${LOGLEVEL}" Mar 10 00:07:24 crc kubenswrapper[4906]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 00:07:24 crc kubenswrapper[4906]: > logger="UnhandledError" Mar 10 00:07:24 crc kubenswrapper[4906]: W0310 00:07:24.883444 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-d47500314598b54c196aec63bc5eae1917c20d2de347303e967f6046ce440a3f WatchSource:0}: Error finding container d47500314598b54c196aec63bc5eae1917c20d2de347303e967f6046ce440a3f: Status 404 returned error can't find the container with id d47500314598b54c196aec63bc5eae1917c20d2de347303e967f6046ce440a3f Mar 10 00:07:24 crc kubenswrapper[4906]: E0310 00:07:24.883734 4906 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 00:07:24 crc kubenswrapper[4906]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 10 00:07:24 crc kubenswrapper[4906]: if [[ -f "/env/_master" ]]; then Mar 10 00:07:24 crc kubenswrapper[4906]: set -o allexport Mar 10 00:07:24 crc kubenswrapper[4906]: source "/env/_master" Mar 10 00:07:24 crc kubenswrapper[4906]: set +o allexport Mar 10 00:07:24 crc kubenswrapper[4906]: fi Mar 10 00:07:24 crc kubenswrapper[4906]: Mar 10 00:07:24 crc kubenswrapper[4906]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 10 00:07:24 crc kubenswrapper[4906]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 10 00:07:24 crc kubenswrapper[4906]: --disable-webhook \ Mar 10 00:07:24 crc kubenswrapper[4906]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 10 00:07:24 crc kubenswrapper[4906]: 
--loglevel="${LOGLEVEL}" Mar 10 00:07:24 crc kubenswrapper[4906]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 00:07:24 crc kubenswrapper[4906]: > logger="UnhandledError" Mar 10 00:07:24 crc kubenswrapper[4906]: E0310 00:07:24.885048 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not 
yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 10 00:07:24 crc kubenswrapper[4906]: E0310 00:07:24.885803 4906 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 10 00:07:24 crc kubenswrapper[4906]: E0310 00:07:24.887204 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.890867 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"c95a91e1eed6b2ff9de9f395975b5f05b6c9f06056c020584f3c6a227c83ff93"} Mar 10 00:07:24 crc kubenswrapper[4906]: E0310 00:07:24.893029 4906 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 00:07:24 crc kubenswrapper[4906]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 10 00:07:24 crc kubenswrapper[4906]: set -o allexport Mar 10 00:07:24 crc kubenswrapper[4906]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 10 00:07:24 crc kubenswrapper[4906]: source /etc/kubernetes/apiserver-url.env Mar 10 00:07:24 crc kubenswrapper[4906]: else Mar 10 00:07:24 crc kubenswrapper[4906]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 10 00:07:24 crc kubenswrapper[4906]: exit 1 Mar 10 00:07:24 crc kubenswrapper[4906]: fi Mar 10 00:07:24 crc kubenswrapper[4906]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 10 00:07:24 crc kubenswrapper[4906]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 00:07:24 crc kubenswrapper[4906]: > logger="UnhandledError" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.893121 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"d47500314598b54c196aec63bc5eae1917c20d2de347303e967f6046ce440a3f"} Mar 10 00:07:24 crc kubenswrapper[4906]: E0310 00:07:24.894277 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 10 00:07:24 crc kubenswrapper[4906]: E0310 00:07:24.894618 4906 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices
:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.894664 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"cba88e7b36a1232c3092aacaa48ef9b564cad0fba41b6df70a7c8893c50d7644"} Mar 10 00:07:24 crc kubenswrapper[4906]: E0310 00:07:24.895717 4906 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 00:07:24 crc kubenswrapper[4906]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 10 00:07:24 crc kubenswrapper[4906]: if [[ -f "/env/_master" ]]; then Mar 10 00:07:24 crc kubenswrapper[4906]: set -o allexport Mar 10 00:07:24 crc kubenswrapper[4906]: source "/env/_master" Mar 10 00:07:24 crc kubenswrapper[4906]: set +o allexport Mar 10 00:07:24 crc kubenswrapper[4906]: fi Mar 10 00:07:24 crc kubenswrapper[4906]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 10 00:07:24 crc kubenswrapper[4906]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 10 00:07:24 crc kubenswrapper[4906]: ho_enable="--enable-hybrid-overlay" Mar 10 00:07:24 crc kubenswrapper[4906]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 10 00:07:24 crc kubenswrapper[4906]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 10 00:07:24 crc kubenswrapper[4906]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 10 00:07:24 crc kubenswrapper[4906]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 10 00:07:24 crc kubenswrapper[4906]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 10 00:07:24 crc kubenswrapper[4906]: --webhook-host=127.0.0.1 \ Mar 10 00:07:24 crc kubenswrapper[4906]: --webhook-port=9743 \ Mar 10 00:07:24 crc kubenswrapper[4906]: ${ho_enable} \ Mar 10 00:07:24 crc kubenswrapper[4906]: --enable-interconnect \ Mar 10 00:07:24 crc kubenswrapper[4906]: --disable-approver \ Mar 10 00:07:24 crc kubenswrapper[4906]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 10 00:07:24 crc kubenswrapper[4906]: --wait-for-kubernetes-api=200s \ Mar 10 00:07:24 crc kubenswrapper[4906]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 10 00:07:24 crc kubenswrapper[4906]: --loglevel="${LOGLEVEL}" Mar 10 00:07:24 crc kubenswrapper[4906]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 00:07:24 crc kubenswrapper[4906]: > logger="UnhandledError" Mar 10 00:07:24 crc kubenswrapper[4906]: E0310 00:07:24.895766 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" 
podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 10 00:07:24 crc kubenswrapper[4906]: E0310 00:07:24.897648 4906 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 00:07:24 crc kubenswrapper[4906]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 10 00:07:24 crc kubenswrapper[4906]: if [[ -f "/env/_master" ]]; then Mar 10 00:07:24 crc kubenswrapper[4906]: set -o allexport Mar 10 00:07:24 crc kubenswrapper[4906]: source "/env/_master" Mar 10 00:07:24 crc kubenswrapper[4906]: set +o allexport Mar 10 00:07:24 crc kubenswrapper[4906]: fi Mar 10 00:07:24 crc kubenswrapper[4906]: Mar 10 00:07:24 crc kubenswrapper[4906]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 10 00:07:24 crc kubenswrapper[4906]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 10 00:07:24 crc kubenswrapper[4906]: --disable-webhook \ Mar 10 00:07:24 crc kubenswrapper[4906]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 10 00:07:24 crc kubenswrapper[4906]: --loglevel="${LOGLEVEL}" Mar 10 00:07:24 crc kubenswrapper[4906]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 00:07:24 crc kubenswrapper[4906]: > logger="UnhandledError" Mar 10 00:07:24 crc kubenswrapper[4906]: E0310 00:07:24.899507 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.899808 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.909172 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.920963 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.930339 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.947968 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8518e6b0-48c0-4076-a58a-e2ba8751c90f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1ce33b4800ac33b19f03506f197894abd2b965c34db7b506d7f06e8824b0229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1237a4e233c776796482d73c65c85e5d86e592d6a7f53a7b76fea4529fe9e435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e445f27670002f5ac446f9be56896acdbf9e3e30801b073b231cc8ce7ddac7c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ab422d45102787731cdbfd6d0e400974761d5af01f0a43d04cc25228c1db30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab422d45102787731cdbfd6d0e400974761d5af01f0a43d04cc25228c1db30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:06:54.228491 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:06:54.228657 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:06:54.229423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1709734890/tls.crt::/tmp/serving-cert-1709734890/tls.key\\\\\\\"\\\\nI0310 00:06:54.538287 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:06:54.541181 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:06:54.541202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:06:54.541225 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:06:54.541230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:06:54.549119 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 00:06:54.549178 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549189 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549199 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:06:54.549207 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:06:54.549213 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:06:54.549219 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 00:06:54.549128 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 00:06:54.551727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c49063cf40b4074c120bf78de05481a9e5638b11137a47d73d15b3e1a43f41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.958647 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.962771 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.962894 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.962977 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.963073 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.963199 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:24Z","lastTransitionTime":"2026-03-10T00:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.974874 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.986308 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8518e6b0-48c0-4076-a58a-e2ba8751c90f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1ce33b4800ac33b19f03506f197894abd2b965c34db7b506d7f06e8824b0229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1237a4e233c776796482d73c65c85e5d86e592d6a7f53a7b76fea4529fe9e435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e445f27670002f5ac446f9be56896acdbf9e3e30801b073b231cc8ce7ddac7c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4ab422d45102787731cdbfd6d0e400974761d5af01f0a43d04cc25228c1db30\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab422d45102787731cdbfd6d0e400974761d5af01f0a43d04cc25228c1db30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:06:54.228491 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:06:54.228657 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:06:54.229423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1709734890/tls.crt::/tmp/serving-cert-1709734890/tls.key\\\\\\\"\\\\nI0310 00:06:54.538287 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:06:54.541181 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:06:54.541202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:06:54.541225 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:06:54.541230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:06:54.549119 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 00:06:54.549178 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549189 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549199 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:06:54.549207 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:06:54.549213 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:06:54.549219 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 00:06:54.549128 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 00:06:54.551727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c49063cf40b4074c120bf78de05481a9e5638b11137a47d73d15b3e1a43f41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:24 crc kubenswrapper[4906]: I0310 00:07:24.999124 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.015993 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.026180 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.035215 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.048058 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.059837 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.066414 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.066448 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.066464 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.066488 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.066499 4906 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:25Z","lastTransitionTime":"2026-03-10T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.169176 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.169217 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.169233 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.169254 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.169267 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:25Z","lastTransitionTime":"2026-03-10T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.222901 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:07:25 crc kubenswrapper[4906]: E0310 00:07:25.223084 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:07:26.223062942 +0000 UTC m=+72.370958074 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.272286 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.272317 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.272328 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.272345 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:25 crc kubenswrapper[4906]: 
I0310 00:07:25.272357 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:25Z","lastTransitionTime":"2026-03-10T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.323806 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.323851 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.323881 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.323908 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:25 crc kubenswrapper[4906]: E0310 00:07:25.324048 4906 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 00:07:25 crc kubenswrapper[4906]: E0310 00:07:25.324102 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 00:07:26.32408819 +0000 UTC m=+72.471983312 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 00:07:25 crc kubenswrapper[4906]: E0310 00:07:25.324495 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 00:07:25 crc kubenswrapper[4906]: E0310 00:07:25.324518 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 00:07:25 crc kubenswrapper[4906]: E0310 00:07:25.324533 4906 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:25 crc kubenswrapper[4906]: E0310 00:07:25.324569 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 00:07:26.324557123 +0000 UTC m=+72.472452245 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:25 crc kubenswrapper[4906]: E0310 00:07:25.324628 4906 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 00:07:25 crc kubenswrapper[4906]: E0310 00:07:25.324682 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 00:07:26.324672897 +0000 UTC m=+72.472568029 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 00:07:25 crc kubenswrapper[4906]: E0310 00:07:25.324744 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 00:07:25 crc kubenswrapper[4906]: E0310 00:07:25.324758 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 00:07:25 crc kubenswrapper[4906]: E0310 00:07:25.324768 4906 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:25 crc kubenswrapper[4906]: E0310 00:07:25.324797 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 00:07:26.32478919 +0000 UTC m=+72.472684322 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.375622 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.375687 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.375699 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.375718 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.375732 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:25Z","lastTransitionTime":"2026-03-10T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.478591 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.478652 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.478666 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.478685 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.478702 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:25Z","lastTransitionTime":"2026-03-10T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.581922 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.581965 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.581977 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.581998 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.582011 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:25Z","lastTransitionTime":"2026-03-10T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.685323 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.685372 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.685390 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.685416 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.685437 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:25Z","lastTransitionTime":"2026-03-10T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.788076 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.788133 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.788146 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.788162 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.788173 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:25Z","lastTransitionTime":"2026-03-10T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.891059 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.891128 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.891145 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.891170 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.891189 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:25Z","lastTransitionTime":"2026-03-10T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.899863 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.902218 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e75634c35b3e26fec38c8aa89f5a8055117f60566c8d46cf9ec28cad40c688be"} Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.902554 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.925628 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.940877 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.958012 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8518e6b0-48c0-4076-a58a-e2ba8751c90f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1ce33b4800ac33b19f03506f197894abd2b965c34db7b506d7f06e8824b0229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1237a4e233c776796482d73c65c85e5d86e592d6a7f53a7b76fea4529fe9e435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e445f27670002f5ac446f9be56896acdbf9e3e30801b073b231cc8ce7ddac7c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75634c35b3e26fec38c8aa89f5a8055117f60566c8d46cf9ec28cad40c688be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab422d45102787731cdbfd6d0e400974761d5af01f0a43d04cc25228c1db30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:06:54.228491 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:06:54.228657 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:06:54.229423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1709734890/tls.crt::/tmp/serving-cert-1709734890/tls.key\\\\\\\"\\\\nI0310 00:06:54.538287 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:06:54.541181 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:06:54.541202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:06:54.541225 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:06:54.541230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:06:54.549119 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 00:06:54.549178 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549189 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549199 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:06:54.549207 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:06:54.549213 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:06:54.549219 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 00:06:54.549128 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 00:06:54.551727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c49063cf40b4074c120bf78de05481a9e5638b11137a47d73d15b3e1a43f41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.971983 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.987905 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.994055 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.994115 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.994143 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.994180 4906 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.994211 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:25Z","lastTransitionTime":"2026-03-10T00:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:25 crc kubenswrapper[4906]: I0310 00:07:25.999395 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.010111 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.097883 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.097948 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 
00:07:26.097956 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.098045 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.098062 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:26Z","lastTransitionTime":"2026-03-10T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.201713 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.201772 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.201783 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.201804 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.201815 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:26Z","lastTransitionTime":"2026-03-10T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.233670 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:07:26 crc kubenswrapper[4906]: E0310 00:07:26.233841 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:07:28.233818333 +0000 UTC m=+74.381713445 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.304750 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.305091 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.305101 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.305122 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:26 crc kubenswrapper[4906]: 
I0310 00:07:26.305133 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:26Z","lastTransitionTime":"2026-03-10T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.334759 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.334831 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.334862 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.334890 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:07:26 crc kubenswrapper[4906]: E0310 00:07:26.334908 4906 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 00:07:26 crc kubenswrapper[4906]: E0310 00:07:26.334994 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 00:07:28.334953713 +0000 UTC m=+74.482848815 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 00:07:26 crc kubenswrapper[4906]: E0310 00:07:26.335007 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 00:07:26 crc kubenswrapper[4906]: E0310 00:07:26.335025 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 00:07:26 crc kubenswrapper[4906]: E0310 00:07:26.335038 4906 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:26 crc kubenswrapper[4906]: E0310 00:07:26.335083 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 00:07:28.335071756 +0000 UTC m=+74.482966878 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:26 crc kubenswrapper[4906]: E0310 00:07:26.335115 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 00:07:26 crc kubenswrapper[4906]: E0310 00:07:26.335141 4906 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 00:07:26 crc kubenswrapper[4906]: E0310 00:07:26.335168 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 00:07:26 crc kubenswrapper[4906]: E0310 00:07:26.335190 4906 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:26 crc 
kubenswrapper[4906]: E0310 00:07:26.335259 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 00:07:28.335231471 +0000 UTC m=+74.483126593 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 00:07:26 crc kubenswrapper[4906]: E0310 00:07:26.335284 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 00:07:28.335272972 +0000 UTC m=+74.483168094 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.407938 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.407973 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.407982 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.408004 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.408014 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:26Z","lastTransitionTime":"2026-03-10T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.510765 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.510849 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.510868 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.510901 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.510921 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:26Z","lastTransitionTime":"2026-03-10T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.576042 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:26 crc kubenswrapper[4906]: E0310 00:07:26.576294 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.577012 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.577195 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:07:26 crc kubenswrapper[4906]: E0310 00:07:26.577433 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:07:26 crc kubenswrapper[4906]: E0310 00:07:26.577604 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.583609 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.584321 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.584991 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.585619 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.586196 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.586686 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.587268 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.587799 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.588386 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.588884 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.589348 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.590023 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.590517 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.591042 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.591541 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.592048 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.592574 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.592975 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.596273 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.596825 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.597627 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.598173 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.598580 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.599525 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.600001 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.601010 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.601770 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.602661 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.603275 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.604542 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.605788 4906 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.606072 4906 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.609407 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.611967 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.613411 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.613938 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.613991 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.614010 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.614052 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.614073 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:26Z","lastTransitionTime":"2026-03-10T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.616172 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.618014 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.618809 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.620358 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.621785 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.624394 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.627079 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.628507 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.631208 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.632299 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.634549 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.636007 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.638218 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.638762 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.639971 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.640525 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.641136 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.642237 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.642748 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.717917 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.717992 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.718008 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.718036 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.718053 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:26Z","lastTransitionTime":"2026-03-10T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.821533 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.821690 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.821711 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.821732 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.821748 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:26Z","lastTransitionTime":"2026-03-10T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.926366 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.926460 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.926481 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.926515 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:26 crc kubenswrapper[4906]: I0310 00:07:26.926537 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:26Z","lastTransitionTime":"2026-03-10T00:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:27 crc kubenswrapper[4906]: I0310 00:07:27.029658 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:27 crc kubenswrapper[4906]: I0310 00:07:27.029763 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:27 crc kubenswrapper[4906]: I0310 00:07:27.029783 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:27 crc kubenswrapper[4906]: I0310 00:07:27.029813 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:27 crc kubenswrapper[4906]: I0310 00:07:27.029836 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:27Z","lastTransitionTime":"2026-03-10T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:27 crc kubenswrapper[4906]: I0310 00:07:27.133191 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:27 crc kubenswrapper[4906]: I0310 00:07:27.133324 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:27 crc kubenswrapper[4906]: I0310 00:07:27.133378 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:27 crc kubenswrapper[4906]: I0310 00:07:27.133422 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:27 crc kubenswrapper[4906]: I0310 00:07:27.133451 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:27Z","lastTransitionTime":"2026-03-10T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:27 crc kubenswrapper[4906]: I0310 00:07:27.238085 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:27 crc kubenswrapper[4906]: I0310 00:07:27.238155 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:27 crc kubenswrapper[4906]: I0310 00:07:27.238173 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:27 crc kubenswrapper[4906]: I0310 00:07:27.238202 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:27 crc kubenswrapper[4906]: I0310 00:07:27.238222 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:27Z","lastTransitionTime":"2026-03-10T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:27 crc kubenswrapper[4906]: I0310 00:07:27.340866 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:27 crc kubenswrapper[4906]: I0310 00:07:27.340928 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:27 crc kubenswrapper[4906]: I0310 00:07:27.340939 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:27 crc kubenswrapper[4906]: I0310 00:07:27.340959 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:27 crc kubenswrapper[4906]: I0310 00:07:27.340970 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:27Z","lastTransitionTime":"2026-03-10T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:27 crc kubenswrapper[4906]: I0310 00:07:27.443272 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:27 crc kubenswrapper[4906]: I0310 00:07:27.443326 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:27 crc kubenswrapper[4906]: I0310 00:07:27.443336 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:27 crc kubenswrapper[4906]: I0310 00:07:27.443352 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:27 crc kubenswrapper[4906]: I0310 00:07:27.443361 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:27Z","lastTransitionTime":"2026-03-10T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:27 crc kubenswrapper[4906]: I0310 00:07:27.545714 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:27 crc kubenswrapper[4906]: I0310 00:07:27.545763 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:27 crc kubenswrapper[4906]: I0310 00:07:27.545774 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:27 crc kubenswrapper[4906]: I0310 00:07:27.545794 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:27 crc kubenswrapper[4906]: I0310 00:07:27.545807 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:27Z","lastTransitionTime":"2026-03-10T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:27 crc kubenswrapper[4906]: I0310 00:07:27.649190 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:27 crc kubenswrapper[4906]: I0310 00:07:27.649257 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:27 crc kubenswrapper[4906]: I0310 00:07:27.649269 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:27 crc kubenswrapper[4906]: I0310 00:07:27.649288 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:27 crc kubenswrapper[4906]: I0310 00:07:27.649299 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:27Z","lastTransitionTime":"2026-03-10T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:27 crc kubenswrapper[4906]: I0310 00:07:27.752576 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:27 crc kubenswrapper[4906]: I0310 00:07:27.752657 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:27 crc kubenswrapper[4906]: I0310 00:07:27.752670 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:27 crc kubenswrapper[4906]: I0310 00:07:27.752693 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:27 crc kubenswrapper[4906]: I0310 00:07:27.752711 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:27Z","lastTransitionTime":"2026-03-10T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:27 crc kubenswrapper[4906]: I0310 00:07:27.855783 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:27 crc kubenswrapper[4906]: I0310 00:07:27.855822 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:27 crc kubenswrapper[4906]: I0310 00:07:27.855831 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:27 crc kubenswrapper[4906]: I0310 00:07:27.855847 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:27 crc kubenswrapper[4906]: I0310 00:07:27.855858 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:27Z","lastTransitionTime":"2026-03-10T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:27 crc kubenswrapper[4906]: I0310 00:07:27.959072 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:27 crc kubenswrapper[4906]: I0310 00:07:27.959114 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:27 crc kubenswrapper[4906]: I0310 00:07:27.959126 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:27 crc kubenswrapper[4906]: I0310 00:07:27.959154 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:27 crc kubenswrapper[4906]: I0310 00:07:27.959168 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:27Z","lastTransitionTime":"2026-03-10T00:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:28 crc kubenswrapper[4906]: I0310 00:07:28.061863 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:28 crc kubenswrapper[4906]: I0310 00:07:28.061942 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:28 crc kubenswrapper[4906]: I0310 00:07:28.061968 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:28 crc kubenswrapper[4906]: I0310 00:07:28.062011 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:28 crc kubenswrapper[4906]: I0310 00:07:28.062035 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:28Z","lastTransitionTime":"2026-03-10T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:28 crc kubenswrapper[4906]: I0310 00:07:28.165450 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:28 crc kubenswrapper[4906]: I0310 00:07:28.165556 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:28 crc kubenswrapper[4906]: I0310 00:07:28.165570 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:28 crc kubenswrapper[4906]: I0310 00:07:28.165594 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:28 crc kubenswrapper[4906]: I0310 00:07:28.165612 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:28Z","lastTransitionTime":"2026-03-10T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:28 crc kubenswrapper[4906]: I0310 00:07:28.252033 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:07:28 crc kubenswrapper[4906]: E0310 00:07:28.252378 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-10 00:07:32.252330596 +0000 UTC m=+78.400225728 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:07:28 crc kubenswrapper[4906]: I0310 00:07:28.268280 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:28 crc kubenswrapper[4906]: I0310 00:07:28.268331 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:28 crc kubenswrapper[4906]: I0310 00:07:28.268345 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:28 crc kubenswrapper[4906]: I0310 00:07:28.268368 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:28 crc kubenswrapper[4906]: I0310 00:07:28.268383 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:28Z","lastTransitionTime":"2026-03-10T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:28 crc kubenswrapper[4906]: I0310 00:07:28.353526 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:07:28 crc kubenswrapper[4906]: I0310 00:07:28.353593 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:28 crc kubenswrapper[4906]: I0310 00:07:28.353664 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:07:28 crc kubenswrapper[4906]: I0310 00:07:28.353708 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:28 crc kubenswrapper[4906]: E0310 00:07:28.353761 4906 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 00:07:28 crc kubenswrapper[4906]: E0310 00:07:28.353792 4906 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 00:07:28 crc kubenswrapper[4906]: E0310 00:07:28.353793 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 00:07:28 crc kubenswrapper[4906]: E0310 00:07:28.353827 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 00:07:28 crc kubenswrapper[4906]: E0310 00:07:28.353849 4906 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:28 crc kubenswrapper[4906]: E0310 00:07:28.353854 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 00:07:32.353828167 +0000 UTC m=+78.501723299 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 00:07:28 crc kubenswrapper[4906]: E0310 00:07:28.353912 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 00:07:32.353895019 +0000 UTC m=+78.501790151 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 00:07:28 crc kubenswrapper[4906]: E0310 00:07:28.353942 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 00:07:32.35393151 +0000 UTC m=+78.501826632 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:28 crc kubenswrapper[4906]: E0310 00:07:28.354010 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 00:07:28 crc kubenswrapper[4906]: E0310 00:07:28.354049 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 00:07:28 crc kubenswrapper[4906]: E0310 00:07:28.354061 4906 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:28 crc kubenswrapper[4906]: E0310 00:07:28.354181 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 00:07:32.354144397 +0000 UTC m=+78.502039509 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:28 crc kubenswrapper[4906]: I0310 00:07:28.371542 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:28 crc kubenswrapper[4906]: I0310 00:07:28.371580 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:28 crc kubenswrapper[4906]: I0310 00:07:28.371609 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:28 crc kubenswrapper[4906]: I0310 00:07:28.371630 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:28 crc kubenswrapper[4906]: I0310 00:07:28.371671 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:28Z","lastTransitionTime":"2026-03-10T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:28 crc kubenswrapper[4906]: I0310 00:07:28.474975 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:28 crc kubenswrapper[4906]: I0310 00:07:28.475054 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:28 crc kubenswrapper[4906]: I0310 00:07:28.475072 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:28 crc kubenswrapper[4906]: I0310 00:07:28.475123 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:28 crc kubenswrapper[4906]: I0310 00:07:28.475140 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:28Z","lastTransitionTime":"2026-03-10T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:28 crc kubenswrapper[4906]: I0310 00:07:28.576365 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:07:28 crc kubenswrapper[4906]: I0310 00:07:28.576433 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:07:28 crc kubenswrapper[4906]: E0310 00:07:28.576705 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:07:28 crc kubenswrapper[4906]: I0310 00:07:28.576737 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:28 crc kubenswrapper[4906]: E0310 00:07:28.576877 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:07:28 crc kubenswrapper[4906]: E0310 00:07:28.576980 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:07:28 crc kubenswrapper[4906]: I0310 00:07:28.578154 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:28 crc kubenswrapper[4906]: I0310 00:07:28.578237 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:28 crc kubenswrapper[4906]: I0310 00:07:28.578252 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:28 crc kubenswrapper[4906]: I0310 00:07:28.578296 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:28 crc kubenswrapper[4906]: I0310 00:07:28.578309 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:28Z","lastTransitionTime":"2026-03-10T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:28 crc kubenswrapper[4906]: I0310 00:07:28.682237 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:28 crc kubenswrapper[4906]: I0310 00:07:28.682328 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:28 crc kubenswrapper[4906]: I0310 00:07:28.682346 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:28 crc kubenswrapper[4906]: I0310 00:07:28.682407 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:28 crc kubenswrapper[4906]: I0310 00:07:28.682426 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:28Z","lastTransitionTime":"2026-03-10T00:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:29 crc kubenswrapper[4906]: I0310 00:07:29.160337 4906 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 10 00:07:29 crc kubenswrapper[4906]: I0310 00:07:29.199120 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:29 crc kubenswrapper[4906]: I0310 00:07:29.199186 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:29 crc kubenswrapper[4906]: I0310 00:07:29.199199 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:29 crc kubenswrapper[4906]: I0310 00:07:29.199225 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:29 crc kubenswrapper[4906]: I0310 00:07:29.199241 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:29Z","lastTransitionTime":"2026-03-10T00:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:30 crc kubenswrapper[4906]: I0310 00:07:30.537743 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:30 crc kubenswrapper[4906]: I0310 00:07:30.537796 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:30 crc kubenswrapper[4906]: I0310 00:07:30.537830 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:30 crc kubenswrapper[4906]: I0310 00:07:30.537855 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:30 crc kubenswrapper[4906]: I0310 00:07:30.537868 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:30Z","lastTransitionTime":"2026-03-10T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:30 crc kubenswrapper[4906]: I0310 00:07:30.575716 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:07:30 crc kubenswrapper[4906]: I0310 00:07:30.575787 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:07:30 crc kubenswrapper[4906]: I0310 00:07:30.575827 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:30 crc kubenswrapper[4906]: E0310 00:07:30.575858 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:07:30 crc kubenswrapper[4906]: E0310 00:07:30.575965 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:07:30 crc kubenswrapper[4906]: E0310 00:07:30.576089 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:07:30 crc kubenswrapper[4906]: I0310 00:07:30.640716 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:30 crc kubenswrapper[4906]: I0310 00:07:30.640756 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:30 crc kubenswrapper[4906]: I0310 00:07:30.640767 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:30 crc kubenswrapper[4906]: I0310 00:07:30.640789 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:30 crc kubenswrapper[4906]: I0310 00:07:30.640801 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:30Z","lastTransitionTime":"2026-03-10T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:30 crc kubenswrapper[4906]: I0310 00:07:30.743624 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:30 crc kubenswrapper[4906]: I0310 00:07:30.743747 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:30 crc kubenswrapper[4906]: I0310 00:07:30.743763 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:30 crc kubenswrapper[4906]: I0310 00:07:30.743782 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:30 crc kubenswrapper[4906]: I0310 00:07:30.743794 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:30Z","lastTransitionTime":"2026-03-10T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:30 crc kubenswrapper[4906]: I0310 00:07:30.847453 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:30 crc kubenswrapper[4906]: I0310 00:07:30.847511 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:30 crc kubenswrapper[4906]: I0310 00:07:30.847525 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:30 crc kubenswrapper[4906]: I0310 00:07:30.847548 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:30 crc kubenswrapper[4906]: I0310 00:07:30.847562 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:30Z","lastTransitionTime":"2026-03-10T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:30 crc kubenswrapper[4906]: I0310 00:07:30.949880 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:30 crc kubenswrapper[4906]: I0310 00:07:30.950169 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:30 crc kubenswrapper[4906]: I0310 00:07:30.950267 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:30 crc kubenswrapper[4906]: I0310 00:07:30.950358 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:30 crc kubenswrapper[4906]: I0310 00:07:30.950423 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:30Z","lastTransitionTime":"2026-03-10T00:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:31 crc kubenswrapper[4906]: I0310 00:07:31.053272 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:31 crc kubenswrapper[4906]: I0310 00:07:31.053572 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:31 crc kubenswrapper[4906]: I0310 00:07:31.053654 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:31 crc kubenswrapper[4906]: I0310 00:07:31.053761 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:31 crc kubenswrapper[4906]: I0310 00:07:31.053854 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:31Z","lastTransitionTime":"2026-03-10T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:31 crc kubenswrapper[4906]: I0310 00:07:31.156433 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:31 crc kubenswrapper[4906]: I0310 00:07:31.156469 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:31 crc kubenswrapper[4906]: I0310 00:07:31.156495 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:31 crc kubenswrapper[4906]: I0310 00:07:31.156513 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:31 crc kubenswrapper[4906]: I0310 00:07:31.156522 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:31Z","lastTransitionTime":"2026-03-10T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:31 crc kubenswrapper[4906]: I0310 00:07:31.258777 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:31 crc kubenswrapper[4906]: I0310 00:07:31.258829 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:31 crc kubenswrapper[4906]: I0310 00:07:31.258841 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:31 crc kubenswrapper[4906]: I0310 00:07:31.258861 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:31 crc kubenswrapper[4906]: I0310 00:07:31.258874 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:31Z","lastTransitionTime":"2026-03-10T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:31 crc kubenswrapper[4906]: I0310 00:07:31.361616 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:31 crc kubenswrapper[4906]: I0310 00:07:31.361677 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:31 crc kubenswrapper[4906]: I0310 00:07:31.361691 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:31 crc kubenswrapper[4906]: I0310 00:07:31.361719 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:31 crc kubenswrapper[4906]: I0310 00:07:31.361734 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:31Z","lastTransitionTime":"2026-03-10T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:31 crc kubenswrapper[4906]: I0310 00:07:31.465623 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:31 crc kubenswrapper[4906]: I0310 00:07:31.465702 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:31 crc kubenswrapper[4906]: I0310 00:07:31.465719 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:31 crc kubenswrapper[4906]: I0310 00:07:31.465738 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:31 crc kubenswrapper[4906]: I0310 00:07:31.465758 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:31Z","lastTransitionTime":"2026-03-10T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:31 crc kubenswrapper[4906]: I0310 00:07:31.568119 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:31 crc kubenswrapper[4906]: I0310 00:07:31.568180 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:31 crc kubenswrapper[4906]: I0310 00:07:31.568191 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:31 crc kubenswrapper[4906]: I0310 00:07:31.568209 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:31 crc kubenswrapper[4906]: I0310 00:07:31.568223 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:31Z","lastTransitionTime":"2026-03-10T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:31 crc kubenswrapper[4906]: I0310 00:07:31.670749 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:31 crc kubenswrapper[4906]: I0310 00:07:31.670794 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:31 crc kubenswrapper[4906]: I0310 00:07:31.670803 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:31 crc kubenswrapper[4906]: I0310 00:07:31.670822 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:31 crc kubenswrapper[4906]: I0310 00:07:31.670832 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:31Z","lastTransitionTime":"2026-03-10T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:31 crc kubenswrapper[4906]: I0310 00:07:31.686864 4906 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 10 00:07:31 crc kubenswrapper[4906]: I0310 00:07:31.773696 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:31 crc kubenswrapper[4906]: I0310 00:07:31.773739 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:31 crc kubenswrapper[4906]: I0310 00:07:31.773749 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:31 crc kubenswrapper[4906]: I0310 00:07:31.773767 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:31 crc kubenswrapper[4906]: I0310 00:07:31.773778 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:31Z","lastTransitionTime":"2026-03-10T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:31 crc kubenswrapper[4906]: I0310 00:07:31.875813 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:31 crc kubenswrapper[4906]: I0310 00:07:31.875841 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:31 crc kubenswrapper[4906]: I0310 00:07:31.875853 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:31 crc kubenswrapper[4906]: I0310 00:07:31.875867 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:31 crc kubenswrapper[4906]: I0310 00:07:31.875877 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:31Z","lastTransitionTime":"2026-03-10T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:31 crc kubenswrapper[4906]: I0310 00:07:31.978678 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:31 crc kubenswrapper[4906]: I0310 00:07:31.978729 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:31 crc kubenswrapper[4906]: I0310 00:07:31.978747 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:31 crc kubenswrapper[4906]: I0310 00:07:31.978780 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:31 crc kubenswrapper[4906]: I0310 00:07:31.978803 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:31Z","lastTransitionTime":"2026-03-10T00:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:32 crc kubenswrapper[4906]: I0310 00:07:32.082082 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:32 crc kubenswrapper[4906]: I0310 00:07:32.082143 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:32 crc kubenswrapper[4906]: I0310 00:07:32.082160 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:32 crc kubenswrapper[4906]: I0310 00:07:32.082186 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:32 crc kubenswrapper[4906]: I0310 00:07:32.082204 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:32Z","lastTransitionTime":"2026-03-10T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:32 crc kubenswrapper[4906]: I0310 00:07:32.185029 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:32 crc kubenswrapper[4906]: I0310 00:07:32.185097 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:32 crc kubenswrapper[4906]: I0310 00:07:32.185115 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:32 crc kubenswrapper[4906]: I0310 00:07:32.185140 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:32 crc kubenswrapper[4906]: I0310 00:07:32.185157 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:32Z","lastTransitionTime":"2026-03-10T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:32 crc kubenswrapper[4906]: I0310 00:07:32.288301 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:32 crc kubenswrapper[4906]: I0310 00:07:32.288418 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:32 crc kubenswrapper[4906]: I0310 00:07:32.288442 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:32 crc kubenswrapper[4906]: I0310 00:07:32.288478 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:32 crc kubenswrapper[4906]: I0310 00:07:32.288506 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:32Z","lastTransitionTime":"2026-03-10T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:32 crc kubenswrapper[4906]: I0310 00:07:32.290467 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:07:32 crc kubenswrapper[4906]: E0310 00:07:32.290735 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-10 00:07:40.290702354 +0000 UTC m=+86.438597506 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:07:32 crc kubenswrapper[4906]: I0310 00:07:32.390887 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:32 crc kubenswrapper[4906]: I0310 00:07:32.390919 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:32 crc kubenswrapper[4906]: I0310 00:07:32.390929 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:32 crc kubenswrapper[4906]: I0310 00:07:32.390944 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:32 crc kubenswrapper[4906]: I0310 00:07:32.390953 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:32Z","lastTransitionTime":"2026-03-10T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:32 crc kubenswrapper[4906]: I0310 00:07:32.391043 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:07:32 crc kubenswrapper[4906]: I0310 00:07:32.391091 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:32 crc kubenswrapper[4906]: I0310 00:07:32.391132 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:07:32 crc kubenswrapper[4906]: I0310 00:07:32.391171 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:32 crc kubenswrapper[4906]: E0310 00:07:32.391189 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Mar 10 00:07:32 crc kubenswrapper[4906]: E0310 00:07:32.391240 4906 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 00:07:32 crc kubenswrapper[4906]: E0310 00:07:32.391244 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 00:07:32 crc kubenswrapper[4906]: E0310 00:07:32.391266 4906 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:32 crc kubenswrapper[4906]: E0310 00:07:32.391289 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 00:07:40.391273398 +0000 UTC m=+86.539168510 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 00:07:32 crc kubenswrapper[4906]: E0310 00:07:32.391320 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 00:07:40.391302119 +0000 UTC m=+86.539197271 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:32 crc kubenswrapper[4906]: E0310 00:07:32.391360 4906 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 00:07:32 crc kubenswrapper[4906]: E0310 00:07:32.391417 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 00:07:32 crc kubenswrapper[4906]: E0310 00:07:32.391467 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 00:07:32 crc kubenswrapper[4906]: E0310 00:07:32.391499 4906 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:32 crc kubenswrapper[4906]: E0310 00:07:32.391529 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 00:07:40.391485554 +0000 UTC m=+86.539380706 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 00:07:32 crc kubenswrapper[4906]: E0310 00:07:32.391581 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 00:07:40.391554906 +0000 UTC m=+86.539450068 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:32 crc kubenswrapper[4906]: I0310 00:07:32.493835 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:32 crc kubenswrapper[4906]: I0310 00:07:32.493894 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:32 crc kubenswrapper[4906]: I0310 00:07:32.493908 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:32 crc kubenswrapper[4906]: I0310 00:07:32.493934 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:32 crc kubenswrapper[4906]: I0310 00:07:32.493946 4906 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:32Z","lastTransitionTime":"2026-03-10T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:32 crc kubenswrapper[4906]: I0310 00:07:32.576519 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:07:32 crc kubenswrapper[4906]: I0310 00:07:32.576705 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:07:32 crc kubenswrapper[4906]: I0310 00:07:32.576859 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:32 crc kubenswrapper[4906]: E0310 00:07:32.576942 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:07:32 crc kubenswrapper[4906]: E0310 00:07:32.576726 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:07:32 crc kubenswrapper[4906]: E0310 00:07:32.577032 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:07:32 crc kubenswrapper[4906]: I0310 00:07:32.597024 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:32 crc kubenswrapper[4906]: I0310 00:07:32.597099 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:32 crc kubenswrapper[4906]: I0310 00:07:32.597127 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:32 crc kubenswrapper[4906]: I0310 00:07:32.597161 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:32 crc kubenswrapper[4906]: I0310 00:07:32.597186 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:32Z","lastTransitionTime":"2026-03-10T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:32 crc kubenswrapper[4906]: I0310 00:07:32.700470 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:32 crc kubenswrapper[4906]: I0310 00:07:32.700536 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:32 crc kubenswrapper[4906]: I0310 00:07:32.700556 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:32 crc kubenswrapper[4906]: I0310 00:07:32.700581 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:32 crc kubenswrapper[4906]: I0310 00:07:32.700598 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:32Z","lastTransitionTime":"2026-03-10T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:32 crc kubenswrapper[4906]: I0310 00:07:32.803179 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:32 crc kubenswrapper[4906]: I0310 00:07:32.803216 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:32 crc kubenswrapper[4906]: I0310 00:07:32.803228 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:32 crc kubenswrapper[4906]: I0310 00:07:32.803245 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:32 crc kubenswrapper[4906]: I0310 00:07:32.803257 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:32Z","lastTransitionTime":"2026-03-10T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:32 crc kubenswrapper[4906]: I0310 00:07:32.905614 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:32 crc kubenswrapper[4906]: I0310 00:07:32.905665 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:32 crc kubenswrapper[4906]: I0310 00:07:32.905673 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:32 crc kubenswrapper[4906]: I0310 00:07:32.905690 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:32 crc kubenswrapper[4906]: I0310 00:07:32.905699 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:32Z","lastTransitionTime":"2026-03-10T00:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:33 crc kubenswrapper[4906]: I0310 00:07:33.007706 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:33 crc kubenswrapper[4906]: I0310 00:07:33.007740 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:33 crc kubenswrapper[4906]: I0310 00:07:33.007749 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:33 crc kubenswrapper[4906]: I0310 00:07:33.007767 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:33 crc kubenswrapper[4906]: I0310 00:07:33.007777 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:33Z","lastTransitionTime":"2026-03-10T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:33 crc kubenswrapper[4906]: I0310 00:07:33.109903 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:33 crc kubenswrapper[4906]: I0310 00:07:33.109981 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:33 crc kubenswrapper[4906]: I0310 00:07:33.110010 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:33 crc kubenswrapper[4906]: I0310 00:07:33.110041 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:33 crc kubenswrapper[4906]: I0310 00:07:33.110062 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:33Z","lastTransitionTime":"2026-03-10T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:33 crc kubenswrapper[4906]: I0310 00:07:33.212101 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:33 crc kubenswrapper[4906]: I0310 00:07:33.212470 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:33 crc kubenswrapper[4906]: I0310 00:07:33.212722 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:33 crc kubenswrapper[4906]: I0310 00:07:33.212946 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:33 crc kubenswrapper[4906]: I0310 00:07:33.213180 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:33Z","lastTransitionTime":"2026-03-10T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:33 crc kubenswrapper[4906]: I0310 00:07:33.316808 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:33 crc kubenswrapper[4906]: I0310 00:07:33.317615 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:33 crc kubenswrapper[4906]: I0310 00:07:33.317705 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:33 crc kubenswrapper[4906]: I0310 00:07:33.317739 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:33 crc kubenswrapper[4906]: I0310 00:07:33.317768 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:33Z","lastTransitionTime":"2026-03-10T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:33 crc kubenswrapper[4906]: I0310 00:07:33.420013 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:33 crc kubenswrapper[4906]: I0310 00:07:33.420253 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:33 crc kubenswrapper[4906]: I0310 00:07:33.420330 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:33 crc kubenswrapper[4906]: I0310 00:07:33.420365 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:33 crc kubenswrapper[4906]: I0310 00:07:33.420390 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:33Z","lastTransitionTime":"2026-03-10T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:33 crc kubenswrapper[4906]: I0310 00:07:33.522910 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:33 crc kubenswrapper[4906]: I0310 00:07:33.522948 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:33 crc kubenswrapper[4906]: I0310 00:07:33.522958 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:33 crc kubenswrapper[4906]: I0310 00:07:33.522975 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:33 crc kubenswrapper[4906]: I0310 00:07:33.522987 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:33Z","lastTransitionTime":"2026-03-10T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:33 crc kubenswrapper[4906]: I0310 00:07:33.626107 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:33 crc kubenswrapper[4906]: I0310 00:07:33.626166 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:33 crc kubenswrapper[4906]: I0310 00:07:33.626178 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:33 crc kubenswrapper[4906]: I0310 00:07:33.626199 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:33 crc kubenswrapper[4906]: I0310 00:07:33.626216 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:33Z","lastTransitionTime":"2026-03-10T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:33 crc kubenswrapper[4906]: I0310 00:07:33.729619 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:33 crc kubenswrapper[4906]: I0310 00:07:33.729737 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:33 crc kubenswrapper[4906]: I0310 00:07:33.729760 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:33 crc kubenswrapper[4906]: I0310 00:07:33.729796 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:33 crc kubenswrapper[4906]: I0310 00:07:33.729820 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:33Z","lastTransitionTime":"2026-03-10T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:33 crc kubenswrapper[4906]: I0310 00:07:33.832666 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:33 crc kubenswrapper[4906]: I0310 00:07:33.832714 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:33 crc kubenswrapper[4906]: I0310 00:07:33.832726 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:33 crc kubenswrapper[4906]: I0310 00:07:33.832753 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:33 crc kubenswrapper[4906]: I0310 00:07:33.832766 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:33Z","lastTransitionTime":"2026-03-10T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:33 crc kubenswrapper[4906]: I0310 00:07:33.935176 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:33 crc kubenswrapper[4906]: I0310 00:07:33.935232 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:33 crc kubenswrapper[4906]: I0310 00:07:33.935247 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:33 crc kubenswrapper[4906]: I0310 00:07:33.935275 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:33 crc kubenswrapper[4906]: I0310 00:07:33.935296 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:33Z","lastTransitionTime":"2026-03-10T00:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.036917 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.036955 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.036963 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.036979 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.036989 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:34Z","lastTransitionTime":"2026-03-10T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.139213 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.139246 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.139256 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.139271 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.139282 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:34Z","lastTransitionTime":"2026-03-10T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.241624 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.241691 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.241701 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.241720 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.241731 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:34Z","lastTransitionTime":"2026-03-10T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.343751 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.343789 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.343799 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.343815 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.343827 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:34Z","lastTransitionTime":"2026-03-10T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.445995 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.446032 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.446042 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.446057 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.446067 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:34Z","lastTransitionTime":"2026-03-10T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.533080 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.533383 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.533396 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.533436 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.533456 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:34Z","lastTransitionTime":"2026-03-10T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:34 crc kubenswrapper[4906]: E0310 00:07:34.546901 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a60179e-98a5-4d7f-9dd0-5aef84f37492\\\",\\\"systemUUID\\\":\\\"5a964b87-c4f2-4bba-95e1-b2c12e6316ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.551032 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.551062 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.551071 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.551088 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.551098 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:34Z","lastTransitionTime":"2026-03-10T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:34 crc kubenswrapper[4906]: E0310 00:07:34.564254 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a60179e-98a5-4d7f-9dd0-5aef84f37492\\\",\\\"systemUUID\\\":\\\"5a964b87-c4f2-4bba-95e1-b2c12e6316ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.567994 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.568024 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.568033 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.568045 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.568055 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:34Z","lastTransitionTime":"2026-03-10T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.576334 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.576364 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.576417 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:07:34 crc kubenswrapper[4906]: E0310 00:07:34.576476 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:07:34 crc kubenswrapper[4906]: E0310 00:07:34.576609 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:07:34 crc kubenswrapper[4906]: E0310 00:07:34.576796 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:07:34 crc kubenswrapper[4906]: E0310 00:07:34.582972 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a60179e-98a5-4d7f-9dd0-5aef84f37492\\\",\\\"systemUUID\\\":\\\"5a964b87-c4f2-4bba-95e1-b2c12e6316ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.586524 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.586557 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.586565 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.586580 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.586590 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:34Z","lastTransitionTime":"2026-03-10T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.591788 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:34 crc kubenswrapper[4906]: E0310 00:07:34.600368 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a60179e-98a5-4d7f-9dd0-5aef84f37492\\\",\\\"systemUUID\\\":\\\"5a964b87-c4f2-4bba-95e1-b2c12e6316ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.603912 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.603943 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.603951 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.603967 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.603962 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.603977 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:34Z","lastTransitionTime":"2026-03-10T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.614155 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:34 crc kubenswrapper[4906]: E0310 00:07:34.621230 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:34Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a60179e-98a5-4d7f-9dd0-5aef84f37492\\\",\\\"systemUUID\\\":\\\"5a964b87-c4f2-4bba-95e1-b2c12e6316ae\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:34 crc kubenswrapper[4906]: E0310 00:07:34.621335 4906 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.622783 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.622844 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.622859 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.622881 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.622895 4906 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:34Z","lastTransitionTime":"2026-03-10T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.625580 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.640775 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.651775 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8518e6b0-48c0-4076-a58a-e2ba8751c90f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1ce33b4800ac33b19f03506f197894abd2b965c34db7b506d7f06e8824b0229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1237a4e233c776796482d73c65c85e5d86e592d6a7f53a7b76fea4529fe9e435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e445f27670002f5ac446f9be56896acdbf9e3e30801b073b231cc8ce7ddac7c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75634c35b3e26fec38c8aa89f5a8055117f60566c8d46cf9ec28cad40c688be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab422d45102787731cdbfd6d0e400974761d5af01f0a43d04cc25228c1db30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:06:54.228491 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:06:54.228657 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:06:54.229423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1709734890/tls.crt::/tmp/serving-cert-1709734890/tls.key\\\\\\\"\\\\nI0310 00:06:54.538287 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:06:54.541181 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:06:54.541202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:06:54.541225 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:06:54.541230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:06:54.549119 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 00:06:54.549178 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549189 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549199 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:06:54.549207 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:06:54.549213 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:06:54.549219 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 00:06:54.549128 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 00:06:54.551727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c49063cf40b4074c120bf78de05481a9e5638b11137a47d73d15b3e1a43f41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.667703 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the 
pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.726428 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.726471 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.726481 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.726504 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.726515 4906 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:34Z","lastTransitionTime":"2026-03-10T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.830177 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.830218 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.830228 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.830245 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.830257 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:34Z","lastTransitionTime":"2026-03-10T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.932783 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.932828 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.932836 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.932851 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:34 crc kubenswrapper[4906]: I0310 00:07:34.932868 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:34Z","lastTransitionTime":"2026-03-10T00:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:35 crc kubenswrapper[4906]: I0310 00:07:35.035901 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:35 crc kubenswrapper[4906]: I0310 00:07:35.035982 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:35 crc kubenswrapper[4906]: I0310 00:07:35.036002 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:35 crc kubenswrapper[4906]: I0310 00:07:35.036034 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:35 crc kubenswrapper[4906]: I0310 00:07:35.036058 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:35Z","lastTransitionTime":"2026-03-10T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:35 crc kubenswrapper[4906]: I0310 00:07:35.139938 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:35 crc kubenswrapper[4906]: I0310 00:07:35.139982 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:35 crc kubenswrapper[4906]: I0310 00:07:35.139991 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:35 crc kubenswrapper[4906]: I0310 00:07:35.140010 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:35 crc kubenswrapper[4906]: I0310 00:07:35.140025 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:35Z","lastTransitionTime":"2026-03-10T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:35 crc kubenswrapper[4906]: I0310 00:07:35.243367 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:35 crc kubenswrapper[4906]: I0310 00:07:35.243432 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:35 crc kubenswrapper[4906]: I0310 00:07:35.243451 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:35 crc kubenswrapper[4906]: I0310 00:07:35.243478 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:35 crc kubenswrapper[4906]: I0310 00:07:35.243499 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:35Z","lastTransitionTime":"2026-03-10T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:35 crc kubenswrapper[4906]: I0310 00:07:35.346752 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:35 crc kubenswrapper[4906]: I0310 00:07:35.346820 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:35 crc kubenswrapper[4906]: I0310 00:07:35.346842 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:35 crc kubenswrapper[4906]: I0310 00:07:35.347043 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:35 crc kubenswrapper[4906]: I0310 00:07:35.347098 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:35Z","lastTransitionTime":"2026-03-10T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:35 crc kubenswrapper[4906]: I0310 00:07:35.450001 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:35 crc kubenswrapper[4906]: I0310 00:07:35.450102 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:35 crc kubenswrapper[4906]: I0310 00:07:35.450125 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:35 crc kubenswrapper[4906]: I0310 00:07:35.450160 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:35 crc kubenswrapper[4906]: I0310 00:07:35.450186 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:35Z","lastTransitionTime":"2026-03-10T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:35 crc kubenswrapper[4906]: I0310 00:07:35.553171 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:35 crc kubenswrapper[4906]: I0310 00:07:35.553235 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:35 crc kubenswrapper[4906]: I0310 00:07:35.553251 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:35 crc kubenswrapper[4906]: I0310 00:07:35.553279 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:35 crc kubenswrapper[4906]: I0310 00:07:35.553296 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:35Z","lastTransitionTime":"2026-03-10T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:35 crc kubenswrapper[4906]: I0310 00:07:35.656973 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:35 crc kubenswrapper[4906]: I0310 00:07:35.657048 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:35 crc kubenswrapper[4906]: I0310 00:07:35.657065 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:35 crc kubenswrapper[4906]: I0310 00:07:35.657093 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:35 crc kubenswrapper[4906]: I0310 00:07:35.657113 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:35Z","lastTransitionTime":"2026-03-10T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:35 crc kubenswrapper[4906]: I0310 00:07:35.759523 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:35 crc kubenswrapper[4906]: I0310 00:07:35.759568 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:35 crc kubenswrapper[4906]: I0310 00:07:35.759579 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:35 crc kubenswrapper[4906]: I0310 00:07:35.759600 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:35 crc kubenswrapper[4906]: I0310 00:07:35.759611 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:35Z","lastTransitionTime":"2026-03-10T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:35 crc kubenswrapper[4906]: I0310 00:07:35.862487 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:35 crc kubenswrapper[4906]: I0310 00:07:35.862556 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:35 crc kubenswrapper[4906]: I0310 00:07:35.862574 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:35 crc kubenswrapper[4906]: I0310 00:07:35.862605 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:35 crc kubenswrapper[4906]: I0310 00:07:35.862622 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:35Z","lastTransitionTime":"2026-03-10T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:35 crc kubenswrapper[4906]: I0310 00:07:35.964782 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:35 crc kubenswrapper[4906]: I0310 00:07:35.964813 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:35 crc kubenswrapper[4906]: I0310 00:07:35.964823 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:35 crc kubenswrapper[4906]: I0310 00:07:35.964837 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:35 crc kubenswrapper[4906]: I0310 00:07:35.964846 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:35Z","lastTransitionTime":"2026-03-10T00:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.067855 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.067897 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.067907 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.067927 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.067938 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:36Z","lastTransitionTime":"2026-03-10T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.170186 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.170245 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.170263 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.170291 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.170320 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:36Z","lastTransitionTime":"2026-03-10T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.272810 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.272878 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.272901 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.272951 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.272973 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:36Z","lastTransitionTime":"2026-03-10T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.376793 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.376831 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.376839 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.376858 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.376869 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:36Z","lastTransitionTime":"2026-03-10T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.479272 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.479340 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.479356 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.479380 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.479393 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:36Z","lastTransitionTime":"2026-03-10T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.576341 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.576418 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.576575 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:07:36 crc kubenswrapper[4906]: E0310 00:07:36.576962 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:07:36 crc kubenswrapper[4906]: E0310 00:07:36.577185 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:07:36 crc kubenswrapper[4906]: E0310 00:07:36.577308 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.581591 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.581629 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.581662 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.581677 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.581689 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:36Z","lastTransitionTime":"2026-03-10T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.685494 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.685563 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.685583 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.685616 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.685668 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:36Z","lastTransitionTime":"2026-03-10T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.788935 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.789182 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.789212 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.789251 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.789277 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:36Z","lastTransitionTime":"2026-03-10T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.891570 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.891613 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.891628 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.891669 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.891682 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:36Z","lastTransitionTime":"2026-03-10T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.937043 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"a72569fcc50f473ea417eb7403f0901fc02d50a886ea39d62b2bb02682bc64dd"} Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.937122 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"d4ed04015b5fffd0db47db60c0aec762bd61d476c7cc3c0e726a5bd5de43baef"} Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.954606 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8518e6b0-48c0-4076-a58a-e2ba8751c90f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1ce33b4800ac33b19f03506f197894abd2b965c34db7b506d7f06e8824b0229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1237a4e233c776796482d73c65c85e5d86e592d6a7f53a7b76fea4529fe9e435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e445f27670002f5ac446f9be56896acdbf9e3e30801b073b231cc8ce7ddac7c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75634c35b3e26fec38c8aa89f5a8055117f60566c8d46cf9ec28cad40c688be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab422d45102787731cdbfd6d0e400974761d5af01f0a43d04cc25228c1db30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:06:54.228491 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:06:54.228657 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:06:54.229423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1709734890/tls.crt::/tmp/serving-cert-1709734890/tls.key\\\\\\\"\\\\nI0310 00:06:54.538287 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:06:54.541181 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:06:54.541202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:06:54.541225 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:06:54.541230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:06:54.549119 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 00:06:54.549178 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549189 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549199 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:06:54.549207 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:06:54.549213 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:06:54.549219 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 00:06:54.549128 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 00:06:54.551727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c49063cf40b4074c120bf78de05481a9e5638b11137a47d73d15b3e1a43f41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.968384 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the 
pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.981959 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72569fcc50f473ea417eb7403f0901fc02d50a886ea39d62b2bb02682bc64dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ed04015b5fffd0db47db60c0aec762bd61d476c7cc3c0e726a5bd5de43baef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.993883 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.994398 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.994433 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.994445 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.994468 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Mar 10 00:07:36 crc kubenswrapper[4906]: I0310 00:07:36.994480 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:36Z","lastTransitionTime":"2026-03-10T00:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:37 crc kubenswrapper[4906]: I0310 00:07:37.010849 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:37 crc kubenswrapper[4906]: I0310 00:07:37.024590 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:37 crc kubenswrapper[4906]: I0310 00:07:37.041671 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 00:07:37 crc kubenswrapper[4906]: I0310 00:07:37.097776 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:37 crc kubenswrapper[4906]: I0310 00:07:37.097854 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:37 crc kubenswrapper[4906]: I0310 00:07:37.097874 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:37 crc kubenswrapper[4906]: I0310 00:07:37.097908 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:37 crc kubenswrapper[4906]: I0310 00:07:37.097932 4906 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:37Z","lastTransitionTime":"2026-03-10T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:37 crc kubenswrapper[4906]: I0310 00:07:37.201972 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:37 crc kubenswrapper[4906]: I0310 00:07:37.202029 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:37 crc kubenswrapper[4906]: I0310 00:07:37.202042 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:37 crc kubenswrapper[4906]: I0310 00:07:37.202067 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:37 crc kubenswrapper[4906]: I0310 00:07:37.202082 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:37Z","lastTransitionTime":"2026-03-10T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:37 crc kubenswrapper[4906]: I0310 00:07:37.305311 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:37 crc kubenswrapper[4906]: I0310 00:07:37.305361 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:37 crc kubenswrapper[4906]: I0310 00:07:37.305375 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:37 crc kubenswrapper[4906]: I0310 00:07:37.305392 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:37 crc kubenswrapper[4906]: I0310 00:07:37.305402 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:37Z","lastTransitionTime":"2026-03-10T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:37 crc kubenswrapper[4906]: I0310 00:07:37.408519 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:37 crc kubenswrapper[4906]: I0310 00:07:37.408570 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:37 crc kubenswrapper[4906]: I0310 00:07:37.408583 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:37 crc kubenswrapper[4906]: I0310 00:07:37.408602 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:37 crc kubenswrapper[4906]: I0310 00:07:37.408616 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:37Z","lastTransitionTime":"2026-03-10T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:37 crc kubenswrapper[4906]: I0310 00:07:37.512039 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:37 crc kubenswrapper[4906]: I0310 00:07:37.512072 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:37 crc kubenswrapper[4906]: I0310 00:07:37.512082 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:37 crc kubenswrapper[4906]: I0310 00:07:37.512097 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:37 crc kubenswrapper[4906]: I0310 00:07:37.512106 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:37Z","lastTransitionTime":"2026-03-10T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:37 crc kubenswrapper[4906]: I0310 00:07:37.528381 4906 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 10 00:07:37 crc kubenswrapper[4906]: I0310 00:07:37.615129 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:37 crc kubenswrapper[4906]: I0310 00:07:37.615177 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:37 crc kubenswrapper[4906]: I0310 00:07:37.615186 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:37 crc kubenswrapper[4906]: I0310 00:07:37.615209 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:37 crc kubenswrapper[4906]: I0310 00:07:37.615220 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:37Z","lastTransitionTime":"2026-03-10T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:37 crc kubenswrapper[4906]: I0310 00:07:37.718266 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:37 crc kubenswrapper[4906]: I0310 00:07:37.718311 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:37 crc kubenswrapper[4906]: I0310 00:07:37.718323 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:37 crc kubenswrapper[4906]: I0310 00:07:37.718339 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:37 crc kubenswrapper[4906]: I0310 00:07:37.718352 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:37Z","lastTransitionTime":"2026-03-10T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:37 crc kubenswrapper[4906]: I0310 00:07:37.820845 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:37 crc kubenswrapper[4906]: I0310 00:07:37.820882 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:37 crc kubenswrapper[4906]: I0310 00:07:37.820891 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:37 crc kubenswrapper[4906]: I0310 00:07:37.820908 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:37 crc kubenswrapper[4906]: I0310 00:07:37.820920 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:37Z","lastTransitionTime":"2026-03-10T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:37 crc kubenswrapper[4906]: I0310 00:07:37.923069 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:37 crc kubenswrapper[4906]: I0310 00:07:37.923134 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:37 crc kubenswrapper[4906]: I0310 00:07:37.923150 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:37 crc kubenswrapper[4906]: I0310 00:07:37.923182 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:37 crc kubenswrapper[4906]: I0310 00:07:37.923196 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:37Z","lastTransitionTime":"2026-03-10T00:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:38 crc kubenswrapper[4906]: I0310 00:07:38.026450 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:38 crc kubenswrapper[4906]: I0310 00:07:38.026496 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:38 crc kubenswrapper[4906]: I0310 00:07:38.026505 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:38 crc kubenswrapper[4906]: I0310 00:07:38.026522 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:38 crc kubenswrapper[4906]: I0310 00:07:38.026537 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:38Z","lastTransitionTime":"2026-03-10T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:38 crc kubenswrapper[4906]: I0310 00:07:38.128872 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:38 crc kubenswrapper[4906]: I0310 00:07:38.128923 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:38 crc kubenswrapper[4906]: I0310 00:07:38.128935 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:38 crc kubenswrapper[4906]: I0310 00:07:38.128955 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:38 crc kubenswrapper[4906]: I0310 00:07:38.128969 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:38Z","lastTransitionTime":"2026-03-10T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:38 crc kubenswrapper[4906]: I0310 00:07:38.232226 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:38 crc kubenswrapper[4906]: I0310 00:07:38.232299 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:38 crc kubenswrapper[4906]: I0310 00:07:38.232321 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:38 crc kubenswrapper[4906]: I0310 00:07:38.232354 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:38 crc kubenswrapper[4906]: I0310 00:07:38.232378 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:38Z","lastTransitionTime":"2026-03-10T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:38 crc kubenswrapper[4906]: I0310 00:07:38.334923 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:38 crc kubenswrapper[4906]: I0310 00:07:38.334975 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:38 crc kubenswrapper[4906]: I0310 00:07:38.334983 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:38 crc kubenswrapper[4906]: I0310 00:07:38.335001 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:38 crc kubenswrapper[4906]: I0310 00:07:38.335012 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:38Z","lastTransitionTime":"2026-03-10T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:38 crc kubenswrapper[4906]: I0310 00:07:38.437676 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:38 crc kubenswrapper[4906]: I0310 00:07:38.437749 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:38 crc kubenswrapper[4906]: I0310 00:07:38.437768 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:38 crc kubenswrapper[4906]: I0310 00:07:38.437799 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:38 crc kubenswrapper[4906]: I0310 00:07:38.437821 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:38Z","lastTransitionTime":"2026-03-10T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:38 crc kubenswrapper[4906]: I0310 00:07:38.541053 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:38 crc kubenswrapper[4906]: I0310 00:07:38.541122 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:38 crc kubenswrapper[4906]: I0310 00:07:38.541149 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:38 crc kubenswrapper[4906]: I0310 00:07:38.541178 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:38 crc kubenswrapper[4906]: I0310 00:07:38.541189 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:38Z","lastTransitionTime":"2026-03-10T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:38 crc kubenswrapper[4906]: I0310 00:07:38.576666 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:07:38 crc kubenswrapper[4906]: I0310 00:07:38.576749 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:07:38 crc kubenswrapper[4906]: E0310 00:07:38.576862 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:07:38 crc kubenswrapper[4906]: E0310 00:07:38.577175 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:07:38 crc kubenswrapper[4906]: I0310 00:07:38.577244 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:38 crc kubenswrapper[4906]: E0310 00:07:38.577512 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:07:38 crc kubenswrapper[4906]: I0310 00:07:38.643664 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:38 crc kubenswrapper[4906]: I0310 00:07:38.643715 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:38 crc kubenswrapper[4906]: I0310 00:07:38.643727 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:38 crc kubenswrapper[4906]: I0310 00:07:38.643746 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:38 crc kubenswrapper[4906]: I0310 00:07:38.643759 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:38Z","lastTransitionTime":"2026-03-10T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:38 crc kubenswrapper[4906]: I0310 00:07:38.747148 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:38 crc kubenswrapper[4906]: I0310 00:07:38.747196 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:38 crc kubenswrapper[4906]: I0310 00:07:38.747205 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:38 crc kubenswrapper[4906]: I0310 00:07:38.747224 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:38 crc kubenswrapper[4906]: I0310 00:07:38.747234 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:38Z","lastTransitionTime":"2026-03-10T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:38 crc kubenswrapper[4906]: I0310 00:07:38.850213 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:38 crc kubenswrapper[4906]: I0310 00:07:38.850262 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:38 crc kubenswrapper[4906]: I0310 00:07:38.850272 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:38 crc kubenswrapper[4906]: I0310 00:07:38.850289 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:38 crc kubenswrapper[4906]: I0310 00:07:38.850305 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:38Z","lastTransitionTime":"2026-03-10T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:38 crc kubenswrapper[4906]: I0310 00:07:38.952613 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:38 crc kubenswrapper[4906]: I0310 00:07:38.952705 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:38 crc kubenswrapper[4906]: I0310 00:07:38.952723 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:38 crc kubenswrapper[4906]: I0310 00:07:38.952756 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:38 crc kubenswrapper[4906]: I0310 00:07:38.952779 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:38Z","lastTransitionTime":"2026-03-10T00:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:39 crc kubenswrapper[4906]: I0310 00:07:39.059933 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:39 crc kubenswrapper[4906]: I0310 00:07:39.060006 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:39 crc kubenswrapper[4906]: I0310 00:07:39.060028 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:39 crc kubenswrapper[4906]: I0310 00:07:39.060061 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:39 crc kubenswrapper[4906]: I0310 00:07:39.060083 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:39Z","lastTransitionTime":"2026-03-10T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:39 crc kubenswrapper[4906]: I0310 00:07:39.164088 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:39 crc kubenswrapper[4906]: I0310 00:07:39.164132 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:39 crc kubenswrapper[4906]: I0310 00:07:39.164146 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:39 crc kubenswrapper[4906]: I0310 00:07:39.164190 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:39 crc kubenswrapper[4906]: I0310 00:07:39.164202 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:39Z","lastTransitionTime":"2026-03-10T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:39 crc kubenswrapper[4906]: I0310 00:07:39.266781 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:39 crc kubenswrapper[4906]: I0310 00:07:39.266862 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:39 crc kubenswrapper[4906]: I0310 00:07:39.266880 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:39 crc kubenswrapper[4906]: I0310 00:07:39.266904 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:39 crc kubenswrapper[4906]: I0310 00:07:39.266923 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:39Z","lastTransitionTime":"2026-03-10T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:39 crc kubenswrapper[4906]: I0310 00:07:39.369382 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:39 crc kubenswrapper[4906]: I0310 00:07:39.369428 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:39 crc kubenswrapper[4906]: I0310 00:07:39.369439 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:39 crc kubenswrapper[4906]: I0310 00:07:39.369458 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:39 crc kubenswrapper[4906]: I0310 00:07:39.369470 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:39Z","lastTransitionTime":"2026-03-10T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:39 crc kubenswrapper[4906]: I0310 00:07:39.472612 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:39 crc kubenswrapper[4906]: I0310 00:07:39.472696 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:39 crc kubenswrapper[4906]: I0310 00:07:39.472708 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:39 crc kubenswrapper[4906]: I0310 00:07:39.472727 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:39 crc kubenswrapper[4906]: I0310 00:07:39.472740 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:39Z","lastTransitionTime":"2026-03-10T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:39 crc kubenswrapper[4906]: I0310 00:07:39.543815 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:07:39 crc kubenswrapper[4906]: I0310 00:07:39.560076 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:39Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:39 crc kubenswrapper[4906]: I0310 00:07:39.574698 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72569fcc50f473ea417eb7403f0901fc02d50a886ea39d62b2bb02682bc64dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ed04015b5fffd0db47db60c0aec762bd61d476c7cc3c0e726a5bd5de43baef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:39Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:39 crc kubenswrapper[4906]: I0310 00:07:39.575211 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:39 crc kubenswrapper[4906]: I0310 00:07:39.575266 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:39 crc kubenswrapper[4906]: I0310 00:07:39.575285 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:39 crc kubenswrapper[4906]: I0310 00:07:39.575309 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:39 crc kubenswrapper[4906]: I0310 00:07:39.575329 4906 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:39Z","lastTransitionTime":"2026-03-10T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:39 crc kubenswrapper[4906]: I0310 00:07:39.592270 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8518e6b0-48c0-4076-a58a-e2ba8751c90f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1ce33b4800ac33b19f03506f197894abd2b965c34db7b506d7f06e8824b0229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1237a4e233c776796482d73c65c85e5d86e592d6a7f53a7b76fea4529fe9e435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e445f27670002f5ac446f9be56896acdbf9e3e30801b073b231cc8ce7ddac7c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e
75634c35b3e26fec38c8aa89f5a8055117f60566c8d46cf9ec28cad40c688be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab422d45102787731cdbfd6d0e400974761d5af01f0a43d04cc25228c1db30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:06:54.228491 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:06:54.228657 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:06:54.229423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1709734890/tls.crt::/tmp/serving-cert-1709734890/tls.key\\\\\\\"\\\\nI0310 00:06:54.538287 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:06:54.541181 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:06:54.541202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:06:54.541225 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:06:54.541230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:06:54.549119 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 00:06:54.549178 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549189 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549199 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:06:54.549207 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:06:54.549213 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:06:54.549219 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 00:06:54.549128 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 00:06:54.551727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c49063cf40b4074c120bf78de05481a9e5638b11137a47d73d15b3e1a43f41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:39Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:39 crc kubenswrapper[4906]: I0310 00:07:39.607159 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:39Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:39 crc kubenswrapper[4906]: I0310 00:07:39.620946 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:39Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:39 crc kubenswrapper[4906]: I0310 00:07:39.633198 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:39Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:39 crc kubenswrapper[4906]: I0310 00:07:39.652591 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:39Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:39 crc kubenswrapper[4906]: I0310 00:07:39.677878 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:39 crc kubenswrapper[4906]: I0310 00:07:39.677938 4906 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:39 crc kubenswrapper[4906]: I0310 00:07:39.677950 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:39 crc kubenswrapper[4906]: I0310 00:07:39.677974 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:39 crc kubenswrapper[4906]: I0310 00:07:39.677988 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:39Z","lastTransitionTime":"2026-03-10T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:39 crc kubenswrapper[4906]: I0310 00:07:39.780885 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:39 crc kubenswrapper[4906]: I0310 00:07:39.780932 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:39 crc kubenswrapper[4906]: I0310 00:07:39.780945 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:39 crc kubenswrapper[4906]: I0310 00:07:39.780966 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:39 crc kubenswrapper[4906]: I0310 00:07:39.780980 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:39Z","lastTransitionTime":"2026-03-10T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration 
file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:39 crc kubenswrapper[4906]: I0310 00:07:39.883528 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:39 crc kubenswrapper[4906]: I0310 00:07:39.883575 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:39 crc kubenswrapper[4906]: I0310 00:07:39.883584 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:39 crc kubenswrapper[4906]: I0310 00:07:39.883601 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:39 crc kubenswrapper[4906]: I0310 00:07:39.883612 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:39Z","lastTransitionTime":"2026-03-10T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:39 crc kubenswrapper[4906]: I0310 00:07:39.986696 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:39 crc kubenswrapper[4906]: I0310 00:07:39.986749 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:39 crc kubenswrapper[4906]: I0310 00:07:39.986761 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:39 crc kubenswrapper[4906]: I0310 00:07:39.986784 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:39 crc kubenswrapper[4906]: I0310 00:07:39.986800 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:39Z","lastTransitionTime":"2026-03-10T00:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:40 crc kubenswrapper[4906]: I0310 00:07:40.089156 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:40 crc kubenswrapper[4906]: I0310 00:07:40.089198 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:40 crc kubenswrapper[4906]: I0310 00:07:40.089207 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:40 crc kubenswrapper[4906]: I0310 00:07:40.089222 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:40 crc kubenswrapper[4906]: I0310 00:07:40.089234 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:40Z","lastTransitionTime":"2026-03-10T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:40 crc kubenswrapper[4906]: I0310 00:07:40.192337 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:40 crc kubenswrapper[4906]: I0310 00:07:40.192380 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:40 crc kubenswrapper[4906]: I0310 00:07:40.192390 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:40 crc kubenswrapper[4906]: I0310 00:07:40.192406 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:40 crc kubenswrapper[4906]: I0310 00:07:40.192417 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:40Z","lastTransitionTime":"2026-03-10T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:40 crc kubenswrapper[4906]: I0310 00:07:40.295345 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:40 crc kubenswrapper[4906]: I0310 00:07:40.295401 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:40 crc kubenswrapper[4906]: I0310 00:07:40.295420 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:40 crc kubenswrapper[4906]: I0310 00:07:40.295446 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:40 crc kubenswrapper[4906]: I0310 00:07:40.295464 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:40Z","lastTransitionTime":"2026-03-10T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:40 crc kubenswrapper[4906]: I0310 00:07:40.365264 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:07:40 crc kubenswrapper[4906]: E0310 00:07:40.365612 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-10 00:07:56.365561541 +0000 UTC m=+102.513456693 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:07:40 crc kubenswrapper[4906]: I0310 00:07:40.398428 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:40 crc kubenswrapper[4906]: I0310 00:07:40.398494 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:40 crc kubenswrapper[4906]: I0310 00:07:40.398522 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:40 crc kubenswrapper[4906]: I0310 00:07:40.398607 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:40 crc kubenswrapper[4906]: I0310 00:07:40.398628 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:40Z","lastTransitionTime":"2026-03-10T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:40 crc kubenswrapper[4906]: I0310 00:07:40.466232 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:07:40 crc kubenswrapper[4906]: I0310 00:07:40.466291 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:40 crc kubenswrapper[4906]: I0310 00:07:40.466313 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:07:40 crc kubenswrapper[4906]: I0310 00:07:40.466335 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:40 crc kubenswrapper[4906]: E0310 00:07:40.466472 4906 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Mar 10 00:07:40 crc kubenswrapper[4906]: E0310 00:07:40.466549 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 00:07:56.466532617 +0000 UTC m=+102.614427729 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 00:07:40 crc kubenswrapper[4906]: E0310 00:07:40.466733 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 00:07:40 crc kubenswrapper[4906]: E0310 00:07:40.466824 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 00:07:40 crc kubenswrapper[4906]: E0310 00:07:40.466838 4906 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 00:07:40 crc kubenswrapper[4906]: E0310 00:07:40.466852 4906 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:40 crc kubenswrapper[4906]: E0310 00:07:40.467028 4906 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 00:07:56.466941389 +0000 UTC m=+102.614836531 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 00:07:40 crc kubenswrapper[4906]: E0310 00:07:40.466741 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 00:07:40 crc kubenswrapper[4906]: E0310 00:07:40.467079 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 00:07:40 crc kubenswrapper[4906]: E0310 00:07:40.467102 4906 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:40 crc kubenswrapper[4906]: E0310 00:07:40.467137 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 00:07:56.467047282 +0000 UTC m=+102.614942424 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:40 crc kubenswrapper[4906]: E0310 00:07:40.467192 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 00:07:56.467168366 +0000 UTC m=+102.615063688 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:40 crc kubenswrapper[4906]: I0310 00:07:40.501454 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:40 crc kubenswrapper[4906]: I0310 00:07:40.501517 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:40 crc kubenswrapper[4906]: I0310 00:07:40.501529 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:40 crc kubenswrapper[4906]: I0310 00:07:40.501554 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:40 crc kubenswrapper[4906]: I0310 00:07:40.501570 4906 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:40Z","lastTransitionTime":"2026-03-10T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:40 crc kubenswrapper[4906]: I0310 00:07:40.575848 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:07:40 crc kubenswrapper[4906]: I0310 00:07:40.575919 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:07:40 crc kubenswrapper[4906]: I0310 00:07:40.575975 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:40 crc kubenswrapper[4906]: E0310 00:07:40.576156 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:07:40 crc kubenswrapper[4906]: E0310 00:07:40.576378 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:07:40 crc kubenswrapper[4906]: E0310 00:07:40.576812 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:07:40 crc kubenswrapper[4906]: I0310 00:07:40.609995 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:40 crc kubenswrapper[4906]: I0310 00:07:40.610059 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:40 crc kubenswrapper[4906]: I0310 00:07:40.610081 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:40 crc kubenswrapper[4906]: I0310 00:07:40.610115 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:40 crc kubenswrapper[4906]: I0310 00:07:40.610138 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:40Z","lastTransitionTime":"2026-03-10T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:40 crc kubenswrapper[4906]: I0310 00:07:40.713366 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:40 crc kubenswrapper[4906]: I0310 00:07:40.713444 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:40 crc kubenswrapper[4906]: I0310 00:07:40.713458 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:40 crc kubenswrapper[4906]: I0310 00:07:40.713490 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:40 crc kubenswrapper[4906]: I0310 00:07:40.713505 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:40Z","lastTransitionTime":"2026-03-10T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:40 crc kubenswrapper[4906]: I0310 00:07:40.817190 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:40 crc kubenswrapper[4906]: I0310 00:07:40.817262 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:40 crc kubenswrapper[4906]: I0310 00:07:40.817280 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:40 crc kubenswrapper[4906]: I0310 00:07:40.817312 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:40 crc kubenswrapper[4906]: I0310 00:07:40.817334 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:40Z","lastTransitionTime":"2026-03-10T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:40 crc kubenswrapper[4906]: I0310 00:07:40.920208 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:40 crc kubenswrapper[4906]: I0310 00:07:40.920273 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:40 crc kubenswrapper[4906]: I0310 00:07:40.920298 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:40 crc kubenswrapper[4906]: I0310 00:07:40.920321 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:40 crc kubenswrapper[4906]: I0310 00:07:40.920336 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:40Z","lastTransitionTime":"2026-03-10T00:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:40 crc kubenswrapper[4906]: I0310 00:07:40.949482 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"16c37b204d34ae2c304ca12b8978e189ce99a6c9352c9f68e01d074a8fccc1d2"} Mar 10 00:07:40 crc kubenswrapper[4906]: I0310 00:07:40.973595 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:40Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:40 crc kubenswrapper[4906]: I0310 00:07:40.991799 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:40Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:41 crc kubenswrapper[4906]: I0310 00:07:41.011780 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:41Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:41 crc kubenswrapper[4906]: I0310 00:07:41.023041 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:41 crc kubenswrapper[4906]: I0310 00:07:41.023102 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:41 crc kubenswrapper[4906]: I0310 00:07:41.023112 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:41 crc kubenswrapper[4906]: I0310 00:07:41.023136 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:41 crc kubenswrapper[4906]: I0310 00:07:41.023154 4906 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:41Z","lastTransitionTime":"2026-03-10T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:41 crc kubenswrapper[4906]: I0310 00:07:41.029066 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:41Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:41 crc kubenswrapper[4906]: I0310 00:07:41.052869 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c37b204d34ae2c304ca12b8978e189ce99a6c9352c9f68e01d074a8fccc1d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:41Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:41 crc kubenswrapper[4906]: I0310 00:07:41.074387 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72569fcc50f473ea417eb7403f0901fc02d50a886ea39d62b2bb02682bc64dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://d4ed04015b5fffd0db47db60c0aec762bd61d476c7cc3c0e726a5bd5de43baef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:41Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:41 crc kubenswrapper[4906]: I0310 00:07:41.091532 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8518e6b0-48c0-4076-a58a-e2ba8751c90f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1ce33b4800ac33b19f03506f197894abd2b965c34db7b506d7f06e8824b0229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1237a4e233c776796482d73c65c85e5d86e592d6a7f53a7b76fea4529fe9e435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e445f27670002f5ac446f9be56896acdbf9e3e30801b073b231cc8ce7ddac7c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75634c35b3e26fec38c8aa89f5a8055117f60566c8d46cf9ec28cad40c688be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab422d45102787731cdbfd6d0e400974761d5af01f0a43d04cc25228c1db30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:54Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 00:06:54.228491 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:06:54.228657 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:06:54.229423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1709734890/tls.crt::/tmp/serving-cert-1709734890/tls.key\\\\\\\"\\\\nI0310 00:06:54.538287 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:06:54.541181 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:06:54.541202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:06:54.541225 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:06:54.541230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:06:54.549119 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 00:06:54.549178 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549189 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549199 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:06:54.549207 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:06:54.549213 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:06:54.549219 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 00:06:54.549128 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0310 00:06:54.551727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c49063cf40b4074c120bf78de05481a9e5638b11137a47d73d15b3e1a43f41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca17
0f5955a2014a2d1e693bfb82373f964\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:41Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:41 crc kubenswrapper[4906]: I0310 00:07:41.125859 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:41 crc kubenswrapper[4906]: I0310 00:07:41.126294 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:41 crc kubenswrapper[4906]: I0310 00:07:41.126362 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:41 crc kubenswrapper[4906]: I0310 00:07:41.126449 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:41 crc kubenswrapper[4906]: I0310 00:07:41.126540 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:41Z","lastTransitionTime":"2026-03-10T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:41 crc kubenswrapper[4906]: I0310 00:07:41.229112 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:41 crc kubenswrapper[4906]: I0310 00:07:41.229426 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:41 crc kubenswrapper[4906]: I0310 00:07:41.229579 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:41 crc kubenswrapper[4906]: I0310 00:07:41.229755 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:41 crc kubenswrapper[4906]: I0310 00:07:41.229898 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:41Z","lastTransitionTime":"2026-03-10T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:41 crc kubenswrapper[4906]: I0310 00:07:41.333084 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:41 crc kubenswrapper[4906]: I0310 00:07:41.333151 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:41 crc kubenswrapper[4906]: I0310 00:07:41.333161 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:41 crc kubenswrapper[4906]: I0310 00:07:41.333181 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:41 crc kubenswrapper[4906]: I0310 00:07:41.333192 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:41Z","lastTransitionTime":"2026-03-10T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:41 crc kubenswrapper[4906]: I0310 00:07:41.437004 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:41 crc kubenswrapper[4906]: I0310 00:07:41.437445 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:41 crc kubenswrapper[4906]: I0310 00:07:41.437708 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:41 crc kubenswrapper[4906]: I0310 00:07:41.437996 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:41 crc kubenswrapper[4906]: I0310 00:07:41.438380 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:41Z","lastTransitionTime":"2026-03-10T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:41 crc kubenswrapper[4906]: I0310 00:07:41.540895 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:41 crc kubenswrapper[4906]: I0310 00:07:41.541511 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:41 crc kubenswrapper[4906]: I0310 00:07:41.541531 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:41 crc kubenswrapper[4906]: I0310 00:07:41.541572 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:41 crc kubenswrapper[4906]: I0310 00:07:41.541588 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:41Z","lastTransitionTime":"2026-03-10T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:41 crc kubenswrapper[4906]: I0310 00:07:41.645098 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:41 crc kubenswrapper[4906]: I0310 00:07:41.645178 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:41 crc kubenswrapper[4906]: I0310 00:07:41.645195 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:41 crc kubenswrapper[4906]: I0310 00:07:41.645220 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:41 crc kubenswrapper[4906]: I0310 00:07:41.645236 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:41Z","lastTransitionTime":"2026-03-10T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:41 crc kubenswrapper[4906]: I0310 00:07:41.748228 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:41 crc kubenswrapper[4906]: I0310 00:07:41.748314 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:41 crc kubenswrapper[4906]: I0310 00:07:41.748345 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:41 crc kubenswrapper[4906]: I0310 00:07:41.748381 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:41 crc kubenswrapper[4906]: I0310 00:07:41.748407 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:41Z","lastTransitionTime":"2026-03-10T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:41 crc kubenswrapper[4906]: I0310 00:07:41.850875 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:41 crc kubenswrapper[4906]: I0310 00:07:41.850965 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:41 crc kubenswrapper[4906]: I0310 00:07:41.850988 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:41 crc kubenswrapper[4906]: I0310 00:07:41.851021 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:41 crc kubenswrapper[4906]: I0310 00:07:41.851042 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:41Z","lastTransitionTime":"2026-03-10T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:41 crc kubenswrapper[4906]: I0310 00:07:41.953097 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:41 crc kubenswrapper[4906]: I0310 00:07:41.953143 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:41 crc kubenswrapper[4906]: I0310 00:07:41.953153 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:41 crc kubenswrapper[4906]: I0310 00:07:41.953172 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:41 crc kubenswrapper[4906]: I0310 00:07:41.953183 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:41Z","lastTransitionTime":"2026-03-10T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:41 crc kubenswrapper[4906]: I0310 00:07:41.955523 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"161718aee1b97751dccdff3689a0dc26eb0efb1842899be5bb4994598adb36db"} Mar 10 00:07:41 crc kubenswrapper[4906]: I0310 00:07:41.976101 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:41Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:41 crc kubenswrapper[4906]: I0310 00:07:41.992681 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://161718aee1b97751dccdff3689a0dc26eb0efb1842899be5bb4994598adb36db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:07:41Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:42 crc kubenswrapper[4906]: I0310 00:07:42.010521 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:42 crc kubenswrapper[4906]: I0310 00:07:42.029930 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:42 crc kubenswrapper[4906]: I0310 00:07:42.053324 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72569fcc50f473ea417eb7403f0901fc02d50a886ea39d62b2bb02682bc64dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ed04015b5fffd0db47db60c0aec762bd61d476c7cc3c0e726a5bd5de43baef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:42 crc kubenswrapper[4906]: I0310 00:07:42.056214 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:42 crc kubenswrapper[4906]: I0310 00:07:42.056275 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:42 crc kubenswrapper[4906]: I0310 00:07:42.056293 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:42 crc kubenswrapper[4906]: I0310 00:07:42.056318 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:42 crc kubenswrapper[4906]: I0310 00:07:42.056339 4906 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:42Z","lastTransitionTime":"2026-03-10T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:42 crc kubenswrapper[4906]: I0310 00:07:42.069383 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8518e6b0-48c0-4076-a58a-e2ba8751c90f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1ce33b4800ac33b19f03506f197894abd2b965c34db7b506d7f06e8824b0229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1237a4e233c776796482d73c65c85e5d86e592d6a7f53a7b76fea4529fe9e435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e445f27670002f5ac446f9be56896acdbf9e3e30801b073b231cc8ce7ddac7c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e
75634c35b3e26fec38c8aa89f5a8055117f60566c8d46cf9ec28cad40c688be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab422d45102787731cdbfd6d0e400974761d5af01f0a43d04cc25228c1db30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:06:54.228491 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:06:54.228657 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:06:54.229423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1709734890/tls.crt::/tmp/serving-cert-1709734890/tls.key\\\\\\\"\\\\nI0310 00:06:54.538287 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:06:54.541181 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:06:54.541202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:06:54.541225 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:06:54.541230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:06:54.549119 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 00:06:54.549178 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549189 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549199 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:06:54.549207 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:06:54.549213 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:06:54.549219 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 00:06:54.549128 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 00:06:54.551727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c49063cf40b4074c120bf78de05481a9e5638b11137a47d73d15b3e1a43f41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:42 crc kubenswrapper[4906]: I0310 00:07:42.085443 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c37b204d34ae2c304ca12b8978e189ce99a6c9352c9f68e01d074a8fccc1d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:42Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:42 crc kubenswrapper[4906]: I0310 00:07:42.159567 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:42 crc kubenswrapper[4906]: I0310 00:07:42.159621 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:42 crc kubenswrapper[4906]: I0310 00:07:42.159659 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:42 crc kubenswrapper[4906]: I0310 00:07:42.159687 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:42 crc kubenswrapper[4906]: I0310 00:07:42.159702 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:42Z","lastTransitionTime":"2026-03-10T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:42 crc kubenswrapper[4906]: I0310 00:07:42.262816 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:42 crc kubenswrapper[4906]: I0310 00:07:42.262865 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:42 crc kubenswrapper[4906]: I0310 00:07:42.262880 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:42 crc kubenswrapper[4906]: I0310 00:07:42.262907 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:42 crc kubenswrapper[4906]: I0310 00:07:42.262920 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:42Z","lastTransitionTime":"2026-03-10T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:42 crc kubenswrapper[4906]: I0310 00:07:42.365537 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:42 crc kubenswrapper[4906]: I0310 00:07:42.365568 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:42 crc kubenswrapper[4906]: I0310 00:07:42.365576 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:42 crc kubenswrapper[4906]: I0310 00:07:42.365590 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:42 crc kubenswrapper[4906]: I0310 00:07:42.365598 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:42Z","lastTransitionTime":"2026-03-10T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:42 crc kubenswrapper[4906]: I0310 00:07:42.468176 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:42 crc kubenswrapper[4906]: I0310 00:07:42.468220 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:42 crc kubenswrapper[4906]: I0310 00:07:42.468234 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:42 crc kubenswrapper[4906]: I0310 00:07:42.468253 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:42 crc kubenswrapper[4906]: I0310 00:07:42.468264 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:42Z","lastTransitionTime":"2026-03-10T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:42 crc kubenswrapper[4906]: I0310 00:07:42.570624 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:42 crc kubenswrapper[4906]: I0310 00:07:42.570667 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:42 crc kubenswrapper[4906]: I0310 00:07:42.570677 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:42 crc kubenswrapper[4906]: I0310 00:07:42.570693 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:42 crc kubenswrapper[4906]: I0310 00:07:42.570702 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:42Z","lastTransitionTime":"2026-03-10T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:42 crc kubenswrapper[4906]: I0310 00:07:42.576831 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:07:42 crc kubenswrapper[4906]: E0310 00:07:42.576968 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:07:42 crc kubenswrapper[4906]: I0310 00:07:42.577623 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:42 crc kubenswrapper[4906]: E0310 00:07:42.577703 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:07:42 crc kubenswrapper[4906]: I0310 00:07:42.577752 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:07:42 crc kubenswrapper[4906]: E0310 00:07:42.577792 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:07:42 crc kubenswrapper[4906]: I0310 00:07:42.673185 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:42 crc kubenswrapper[4906]: I0310 00:07:42.673243 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:42 crc kubenswrapper[4906]: I0310 00:07:42.673256 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:42 crc kubenswrapper[4906]: I0310 00:07:42.673278 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:42 crc kubenswrapper[4906]: I0310 00:07:42.673291 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:42Z","lastTransitionTime":"2026-03-10T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:42 crc kubenswrapper[4906]: I0310 00:07:42.776564 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:42 crc kubenswrapper[4906]: I0310 00:07:42.776619 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:42 crc kubenswrapper[4906]: I0310 00:07:42.776645 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:42 crc kubenswrapper[4906]: I0310 00:07:42.776665 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:42 crc kubenswrapper[4906]: I0310 00:07:42.776677 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:42Z","lastTransitionTime":"2026-03-10T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:42 crc kubenswrapper[4906]: I0310 00:07:42.879200 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:42 crc kubenswrapper[4906]: I0310 00:07:42.879249 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:42 crc kubenswrapper[4906]: I0310 00:07:42.879266 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:42 crc kubenswrapper[4906]: I0310 00:07:42.879289 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:42 crc kubenswrapper[4906]: I0310 00:07:42.879305 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:42Z","lastTransitionTime":"2026-03-10T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:42 crc kubenswrapper[4906]: I0310 00:07:42.981804 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:42 crc kubenswrapper[4906]: I0310 00:07:42.981853 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:42 crc kubenswrapper[4906]: I0310 00:07:42.981865 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:42 crc kubenswrapper[4906]: I0310 00:07:42.981888 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:42 crc kubenswrapper[4906]: I0310 00:07:42.981907 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:42Z","lastTransitionTime":"2026-03-10T00:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:43 crc kubenswrapper[4906]: I0310 00:07:43.084776 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:43 crc kubenswrapper[4906]: I0310 00:07:43.084826 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:43 crc kubenswrapper[4906]: I0310 00:07:43.084836 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:43 crc kubenswrapper[4906]: I0310 00:07:43.084858 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:43 crc kubenswrapper[4906]: I0310 00:07:43.084872 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:43Z","lastTransitionTime":"2026-03-10T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:43 crc kubenswrapper[4906]: I0310 00:07:43.187815 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:43 crc kubenswrapper[4906]: I0310 00:07:43.187874 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:43 crc kubenswrapper[4906]: I0310 00:07:43.187883 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:43 crc kubenswrapper[4906]: I0310 00:07:43.187901 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:43 crc kubenswrapper[4906]: I0310 00:07:43.187911 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:43Z","lastTransitionTime":"2026-03-10T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:43 crc kubenswrapper[4906]: I0310 00:07:43.290506 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:43 crc kubenswrapper[4906]: I0310 00:07:43.290545 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:43 crc kubenswrapper[4906]: I0310 00:07:43.290552 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:43 crc kubenswrapper[4906]: I0310 00:07:43.290567 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:43 crc kubenswrapper[4906]: I0310 00:07:43.290576 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:43Z","lastTransitionTime":"2026-03-10T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:43 crc kubenswrapper[4906]: I0310 00:07:43.392976 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:43 crc kubenswrapper[4906]: I0310 00:07:43.393025 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:43 crc kubenswrapper[4906]: I0310 00:07:43.393034 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:43 crc kubenswrapper[4906]: I0310 00:07:43.393050 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:43 crc kubenswrapper[4906]: I0310 00:07:43.393061 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:43Z","lastTransitionTime":"2026-03-10T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:43 crc kubenswrapper[4906]: I0310 00:07:43.495993 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:43 crc kubenswrapper[4906]: I0310 00:07:43.496055 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:43 crc kubenswrapper[4906]: I0310 00:07:43.496072 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:43 crc kubenswrapper[4906]: I0310 00:07:43.496100 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:43 crc kubenswrapper[4906]: I0310 00:07:43.496118 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:43Z","lastTransitionTime":"2026-03-10T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:43 crc kubenswrapper[4906]: I0310 00:07:43.598619 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:43 crc kubenswrapper[4906]: I0310 00:07:43.598683 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:43 crc kubenswrapper[4906]: I0310 00:07:43.598710 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:43 crc kubenswrapper[4906]: I0310 00:07:43.598729 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:43 crc kubenswrapper[4906]: I0310 00:07:43.598739 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:43Z","lastTransitionTime":"2026-03-10T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:43 crc kubenswrapper[4906]: I0310 00:07:43.701436 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:43 crc kubenswrapper[4906]: I0310 00:07:43.701744 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:43 crc kubenswrapper[4906]: I0310 00:07:43.701806 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:43 crc kubenswrapper[4906]: I0310 00:07:43.701868 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:43 crc kubenswrapper[4906]: I0310 00:07:43.701941 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:43Z","lastTransitionTime":"2026-03-10T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:43 crc kubenswrapper[4906]: I0310 00:07:43.805871 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:43 crc kubenswrapper[4906]: I0310 00:07:43.805920 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:43 crc kubenswrapper[4906]: I0310 00:07:43.805930 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:43 crc kubenswrapper[4906]: I0310 00:07:43.805950 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:43 crc kubenswrapper[4906]: I0310 00:07:43.805960 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:43Z","lastTransitionTime":"2026-03-10T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:43 crc kubenswrapper[4906]: I0310 00:07:43.907920 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:43 crc kubenswrapper[4906]: I0310 00:07:43.907964 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:43 crc kubenswrapper[4906]: I0310 00:07:43.907973 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:43 crc kubenswrapper[4906]: I0310 00:07:43.907994 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:43 crc kubenswrapper[4906]: I0310 00:07:43.908005 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:43Z","lastTransitionTime":"2026-03-10T00:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.011216 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.011267 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.011277 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.011296 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.011312 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:44Z","lastTransitionTime":"2026-03-10T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.114378 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.114451 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.114475 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.114506 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.114531 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:44Z","lastTransitionTime":"2026-03-10T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.218279 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.218356 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.218372 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.218395 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.218411 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:44Z","lastTransitionTime":"2026-03-10T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.321255 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.321310 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.321320 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.321340 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.321351 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:44Z","lastTransitionTime":"2026-03-10T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.424993 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.425039 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.425050 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.425069 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.425078 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:44Z","lastTransitionTime":"2026-03-10T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.528345 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.528398 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.528409 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.528428 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.528440 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:44Z","lastTransitionTime":"2026-03-10T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.576154 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.576243 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:07:44 crc kubenswrapper[4906]: E0310 00:07:44.576307 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:07:44 crc kubenswrapper[4906]: E0310 00:07:44.576410 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.576490 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:07:44 crc kubenswrapper[4906]: E0310 00:07:44.576553 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.594586 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c37b204d34ae2c304ca12b8978e189ce99a6c9352c9f68e01d074a8fccc1d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"r
ecursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:44Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.612274 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72569fcc50f473ea417eb7403f0901fc02d50a886ea39d62b2bb02682bc64dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-ove
rrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ed04015b5fffd0db47db60c0aec762bd61d476c7cc3c0e726a5bd5de43baef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:44Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.630036 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8518e6b0-48c0-4076-a58a-e2ba8751c90f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1ce33b4800ac33b19f03506f197894abd2b965c34db7b506d7f06e8824b0229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1237a4e233c776796482d73c65c85e5d86e592d6a7f53a7b76fea4529fe9e435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e445f27670002f5ac446f9be56896acdbf9e3e30801b073b231cc8ce7ddac7c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75634c35b3e26fec38c8aa89f5a8055117f60566c8d46cf9ec28cad40c688be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab422d45102787731cdbfd6d0e400974761d5af01f0a43d04cc25228c1db30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:54Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 00:06:54.228491 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:06:54.228657 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:06:54.229423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1709734890/tls.crt::/tmp/serving-cert-1709734890/tls.key\\\\\\\"\\\\nI0310 00:06:54.538287 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:06:54.541181 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:06:54.541202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:06:54.541225 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:06:54.541230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:06:54.549119 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 00:06:54.549178 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549189 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549199 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:06:54.549207 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:06:54.549213 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:06:54.549219 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 00:06:54.549128 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0310 00:06:54.551727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c49063cf40b4074c120bf78de05481a9e5638b11137a47d73d15b3e1a43f41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca17
0f5955a2014a2d1e693bfb82373f964\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:44Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.630985 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.631030 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.631046 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.631095 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.631108 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:44Z","lastTransitionTime":"2026-03-10T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.646109 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:44Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.661616 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:44Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.683218 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://161718aee1b97751dccdff3689a0dc26eb0efb1842899be5bb4994598adb36db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:07:44Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.702992 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:44Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.733993 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.734266 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.734355 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.734492 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.734569 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:44Z","lastTransitionTime":"2026-03-10T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.837241 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.837295 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.837309 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.837334 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.837347 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:44Z","lastTransitionTime":"2026-03-10T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.932699 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.933008 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.933071 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.933134 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.933201 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:44Z","lastTransitionTime":"2026-03-10T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:44 crc kubenswrapper[4906]: E0310 00:07:44.951752 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a60179e-98a5-4d7f-9dd0-5aef84f37492\\\",\\\"systemUUID\\\":\\\"5a964b87-c4f2-4bba-95e1-b2c12e6316ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:44Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.955903 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.956104 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.956188 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.956284 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.956369 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:44Z","lastTransitionTime":"2026-03-10T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:44 crc kubenswrapper[4906]: E0310 00:07:44.980100 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a60179e-98a5-4d7f-9dd0-5aef84f37492\\\",\\\"systemUUID\\\":\\\"5a964b87-c4f2-4bba-95e1-b2c12e6316ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:44Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.984596 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.984776 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.984880 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.984985 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:44 crc kubenswrapper[4906]: I0310 00:07:44.985070 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:44Z","lastTransitionTime":"2026-03-10T00:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:45 crc kubenswrapper[4906]: E0310 00:07:45.001236 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a60179e-98a5-4d7f-9dd0-5aef84f37492\\\",\\\"systemUUID\\\":\\\"5a964b87-c4f2-4bba-95e1-b2c12e6316ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:44Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.006092 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.006252 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.006345 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.006452 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.006536 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:45Z","lastTransitionTime":"2026-03-10T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:45 crc kubenswrapper[4906]: E0310 00:07:45.022559 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a60179e-98a5-4d7f-9dd0-5aef84f37492\\\",\\\"systemUUID\\\":\\\"5a964b87-c4f2-4bba-95e1-b2c12e6316ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:45Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.027186 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.027220 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.027239 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.027256 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.027278 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:45Z","lastTransitionTime":"2026-03-10T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:45 crc kubenswrapper[4906]: E0310 00:07:45.041348 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a60179e-98a5-4d7f-9dd0-5aef84f37492\\\",\\\"systemUUID\\\":\\\"5a964b87-c4f2-4bba-95e1-b2c12e6316ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:45Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:45 crc kubenswrapper[4906]: E0310 00:07:45.041472 4906 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.043817 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.043867 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.043882 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.043908 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.043928 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:45Z","lastTransitionTime":"2026-03-10T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.147810 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.147861 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.147872 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.147892 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.147903 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:45Z","lastTransitionTime":"2026-03-10T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.251070 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.251117 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.251132 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.251154 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.251167 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:45Z","lastTransitionTime":"2026-03-10T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.355194 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.355240 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.355256 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.355279 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.355293 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:45Z","lastTransitionTime":"2026-03-10T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.461188 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.461247 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.461263 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.461285 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.461305 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:45Z","lastTransitionTime":"2026-03-10T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.564288 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.564328 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.564337 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.564354 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.564363 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:45Z","lastTransitionTime":"2026-03-10T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.668132 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.668187 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.668197 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.668216 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.668228 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:45Z","lastTransitionTime":"2026-03-10T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.771557 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.771611 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.771620 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.771668 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.771678 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:45Z","lastTransitionTime":"2026-03-10T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.875181 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.875250 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.875267 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.875294 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.875308 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:45Z","lastTransitionTime":"2026-03-10T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.977953 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.977993 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.978004 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.978019 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:45 crc kubenswrapper[4906]: I0310 00:07:45.978028 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:45Z","lastTransitionTime":"2026-03-10T00:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:46 crc kubenswrapper[4906]: I0310 00:07:46.081351 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:46 crc kubenswrapper[4906]: I0310 00:07:46.081665 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:46 crc kubenswrapper[4906]: I0310 00:07:46.081766 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:46 crc kubenswrapper[4906]: I0310 00:07:46.081886 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:46 crc kubenswrapper[4906]: I0310 00:07:46.081995 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:46Z","lastTransitionTime":"2026-03-10T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:46 crc kubenswrapper[4906]: I0310 00:07:46.185695 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:46 crc kubenswrapper[4906]: I0310 00:07:46.185754 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:46 crc kubenswrapper[4906]: I0310 00:07:46.185772 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:46 crc kubenswrapper[4906]: I0310 00:07:46.185799 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:46 crc kubenswrapper[4906]: I0310 00:07:46.185817 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:46Z","lastTransitionTime":"2026-03-10T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:46 crc kubenswrapper[4906]: I0310 00:07:46.288035 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:46 crc kubenswrapper[4906]: I0310 00:07:46.288124 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:46 crc kubenswrapper[4906]: I0310 00:07:46.288149 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:46 crc kubenswrapper[4906]: I0310 00:07:46.288172 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:46 crc kubenswrapper[4906]: I0310 00:07:46.288229 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:46Z","lastTransitionTime":"2026-03-10T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:46 crc kubenswrapper[4906]: I0310 00:07:46.390890 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:46 crc kubenswrapper[4906]: I0310 00:07:46.390928 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:46 crc kubenswrapper[4906]: I0310 00:07:46.390937 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:46 crc kubenswrapper[4906]: I0310 00:07:46.390952 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:46 crc kubenswrapper[4906]: I0310 00:07:46.390963 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:46Z","lastTransitionTime":"2026-03-10T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:46 crc kubenswrapper[4906]: I0310 00:07:46.494023 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:46 crc kubenswrapper[4906]: I0310 00:07:46.494061 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:46 crc kubenswrapper[4906]: I0310 00:07:46.494070 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:46 crc kubenswrapper[4906]: I0310 00:07:46.494085 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:46 crc kubenswrapper[4906]: I0310 00:07:46.494098 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:46Z","lastTransitionTime":"2026-03-10T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:46 crc kubenswrapper[4906]: I0310 00:07:46.576491 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:07:46 crc kubenswrapper[4906]: I0310 00:07:46.576592 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:07:46 crc kubenswrapper[4906]: I0310 00:07:46.576598 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:46 crc kubenswrapper[4906]: E0310 00:07:46.576786 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:07:46 crc kubenswrapper[4906]: E0310 00:07:46.576888 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:07:46 crc kubenswrapper[4906]: E0310 00:07:46.577055 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:07:46 crc kubenswrapper[4906]: I0310 00:07:46.592300 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 10 00:07:46 crc kubenswrapper[4906]: I0310 00:07:46.596202 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:46 crc kubenswrapper[4906]: I0310 00:07:46.596259 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:46 crc kubenswrapper[4906]: I0310 00:07:46.596279 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:46 crc kubenswrapper[4906]: I0310 00:07:46.596305 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:46 crc kubenswrapper[4906]: I0310 00:07:46.596326 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:46Z","lastTransitionTime":"2026-03-10T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:46 crc kubenswrapper[4906]: I0310 00:07:46.699201 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:46 crc kubenswrapper[4906]: I0310 00:07:46.699941 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:46 crc kubenswrapper[4906]: I0310 00:07:46.700049 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:46 crc kubenswrapper[4906]: I0310 00:07:46.700190 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:46 crc kubenswrapper[4906]: I0310 00:07:46.700286 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:46Z","lastTransitionTime":"2026-03-10T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:46 crc kubenswrapper[4906]: I0310 00:07:46.803682 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:46 crc kubenswrapper[4906]: I0310 00:07:46.803764 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:46 crc kubenswrapper[4906]: I0310 00:07:46.803787 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:46 crc kubenswrapper[4906]: I0310 00:07:46.803818 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:46 crc kubenswrapper[4906]: I0310 00:07:46.803836 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:46Z","lastTransitionTime":"2026-03-10T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:46 crc kubenswrapper[4906]: I0310 00:07:46.906223 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:46 crc kubenswrapper[4906]: I0310 00:07:46.906278 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:46 crc kubenswrapper[4906]: I0310 00:07:46.906296 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:46 crc kubenswrapper[4906]: I0310 00:07:46.906322 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:46 crc kubenswrapper[4906]: I0310 00:07:46.906340 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:46Z","lastTransitionTime":"2026-03-10T00:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:47 crc kubenswrapper[4906]: I0310 00:07:47.008416 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:47 crc kubenswrapper[4906]: I0310 00:07:47.008470 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:47 crc kubenswrapper[4906]: I0310 00:07:47.008482 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:47 crc kubenswrapper[4906]: I0310 00:07:47.008502 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:47 crc kubenswrapper[4906]: I0310 00:07:47.008825 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:47Z","lastTransitionTime":"2026-03-10T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:47 crc kubenswrapper[4906]: I0310 00:07:47.111702 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:47 crc kubenswrapper[4906]: I0310 00:07:47.111737 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:47 crc kubenswrapper[4906]: I0310 00:07:47.111750 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:47 crc kubenswrapper[4906]: I0310 00:07:47.111768 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:47 crc kubenswrapper[4906]: I0310 00:07:47.111781 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:47Z","lastTransitionTime":"2026-03-10T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:47 crc kubenswrapper[4906]: I0310 00:07:47.214249 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:47 crc kubenswrapper[4906]: I0310 00:07:47.214307 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:47 crc kubenswrapper[4906]: I0310 00:07:47.214323 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:47 crc kubenswrapper[4906]: I0310 00:07:47.214351 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:47 crc kubenswrapper[4906]: I0310 00:07:47.214367 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:47Z","lastTransitionTime":"2026-03-10T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:47 crc kubenswrapper[4906]: I0310 00:07:47.316225 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:47 crc kubenswrapper[4906]: I0310 00:07:47.316289 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:47 crc kubenswrapper[4906]: I0310 00:07:47.316301 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:47 crc kubenswrapper[4906]: I0310 00:07:47.316319 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:47 crc kubenswrapper[4906]: I0310 00:07:47.316330 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:47Z","lastTransitionTime":"2026-03-10T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:47 crc kubenswrapper[4906]: I0310 00:07:47.419569 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:47 crc kubenswrapper[4906]: I0310 00:07:47.419615 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:47 crc kubenswrapper[4906]: I0310 00:07:47.419626 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:47 crc kubenswrapper[4906]: I0310 00:07:47.419661 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:47 crc kubenswrapper[4906]: I0310 00:07:47.419675 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:47Z","lastTransitionTime":"2026-03-10T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:47 crc kubenswrapper[4906]: I0310 00:07:47.522799 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:47 crc kubenswrapper[4906]: I0310 00:07:47.522862 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:47 crc kubenswrapper[4906]: I0310 00:07:47.522882 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:47 crc kubenswrapper[4906]: I0310 00:07:47.522908 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:47 crc kubenswrapper[4906]: I0310 00:07:47.522930 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:47Z","lastTransitionTime":"2026-03-10T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:47 crc kubenswrapper[4906]: I0310 00:07:47.625591 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:47 crc kubenswrapper[4906]: I0310 00:07:47.625654 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:47 crc kubenswrapper[4906]: I0310 00:07:47.625668 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:47 crc kubenswrapper[4906]: I0310 00:07:47.625688 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:47 crc kubenswrapper[4906]: I0310 00:07:47.625705 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:47Z","lastTransitionTime":"2026-03-10T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:47 crc kubenswrapper[4906]: I0310 00:07:47.728496 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:47 crc kubenswrapper[4906]: I0310 00:07:47.728543 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:47 crc kubenswrapper[4906]: I0310 00:07:47.728553 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:47 crc kubenswrapper[4906]: I0310 00:07:47.728572 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:47 crc kubenswrapper[4906]: I0310 00:07:47.728583 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:47Z","lastTransitionTime":"2026-03-10T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:47 crc kubenswrapper[4906]: I0310 00:07:47.831469 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:47 crc kubenswrapper[4906]: I0310 00:07:47.831545 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:47 crc kubenswrapper[4906]: I0310 00:07:47.831557 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:47 crc kubenswrapper[4906]: I0310 00:07:47.831580 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:47 crc kubenswrapper[4906]: I0310 00:07:47.831593 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:47Z","lastTransitionTime":"2026-03-10T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:47 crc kubenswrapper[4906]: I0310 00:07:47.934962 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:47 crc kubenswrapper[4906]: I0310 00:07:47.935015 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:47 crc kubenswrapper[4906]: I0310 00:07:47.935024 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:47 crc kubenswrapper[4906]: I0310 00:07:47.935043 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:47 crc kubenswrapper[4906]: I0310 00:07:47.935055 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:47Z","lastTransitionTime":"2026-03-10T00:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:48 crc kubenswrapper[4906]: I0310 00:07:48.038214 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:48 crc kubenswrapper[4906]: I0310 00:07:48.038261 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:48 crc kubenswrapper[4906]: I0310 00:07:48.038271 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:48 crc kubenswrapper[4906]: I0310 00:07:48.038290 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:48 crc kubenswrapper[4906]: I0310 00:07:48.038302 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:48Z","lastTransitionTime":"2026-03-10T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:48 crc kubenswrapper[4906]: I0310 00:07:48.140116 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:48 crc kubenswrapper[4906]: I0310 00:07:48.140173 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:48 crc kubenswrapper[4906]: I0310 00:07:48.140186 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:48 crc kubenswrapper[4906]: I0310 00:07:48.140206 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:48 crc kubenswrapper[4906]: I0310 00:07:48.140217 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:48Z","lastTransitionTime":"2026-03-10T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:48 crc kubenswrapper[4906]: I0310 00:07:48.242980 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:48 crc kubenswrapper[4906]: I0310 00:07:48.243027 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:48 crc kubenswrapper[4906]: I0310 00:07:48.243038 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:48 crc kubenswrapper[4906]: I0310 00:07:48.243056 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:48 crc kubenswrapper[4906]: I0310 00:07:48.243068 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:48Z","lastTransitionTime":"2026-03-10T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:48 crc kubenswrapper[4906]: I0310 00:07:48.345886 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:48 crc kubenswrapper[4906]: I0310 00:07:48.345959 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:48 crc kubenswrapper[4906]: I0310 00:07:48.345980 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:48 crc kubenswrapper[4906]: I0310 00:07:48.346011 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:48 crc kubenswrapper[4906]: I0310 00:07:48.346033 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:48Z","lastTransitionTime":"2026-03-10T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:48 crc kubenswrapper[4906]: I0310 00:07:48.449256 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:48 crc kubenswrapper[4906]: I0310 00:07:48.449560 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:48 crc kubenswrapper[4906]: I0310 00:07:48.449692 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:48 crc kubenswrapper[4906]: I0310 00:07:48.449807 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:48 crc kubenswrapper[4906]: I0310 00:07:48.449900 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:48Z","lastTransitionTime":"2026-03-10T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:48 crc kubenswrapper[4906]: I0310 00:07:48.553186 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:48 crc kubenswrapper[4906]: I0310 00:07:48.553231 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:48 crc kubenswrapper[4906]: I0310 00:07:48.553244 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:48 crc kubenswrapper[4906]: I0310 00:07:48.553268 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:48 crc kubenswrapper[4906]: I0310 00:07:48.553284 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:48Z","lastTransitionTime":"2026-03-10T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:48 crc kubenswrapper[4906]: I0310 00:07:48.576673 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:07:48 crc kubenswrapper[4906]: I0310 00:07:48.576741 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:07:48 crc kubenswrapper[4906]: E0310 00:07:48.576890 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:07:48 crc kubenswrapper[4906]: I0310 00:07:48.576913 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:48 crc kubenswrapper[4906]: E0310 00:07:48.577046 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:07:48 crc kubenswrapper[4906]: E0310 00:07:48.577189 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:07:48 crc kubenswrapper[4906]: I0310 00:07:48.656107 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:48 crc kubenswrapper[4906]: I0310 00:07:48.656163 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:48 crc kubenswrapper[4906]: I0310 00:07:48.656176 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:48 crc kubenswrapper[4906]: I0310 00:07:48.656196 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:48 crc kubenswrapper[4906]: I0310 00:07:48.656210 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:48Z","lastTransitionTime":"2026-03-10T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:48 crc kubenswrapper[4906]: I0310 00:07:48.758892 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:48 crc kubenswrapper[4906]: I0310 00:07:48.758940 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:48 crc kubenswrapper[4906]: I0310 00:07:48.758956 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:48 crc kubenswrapper[4906]: I0310 00:07:48.758979 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:48 crc kubenswrapper[4906]: I0310 00:07:48.758992 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:48Z","lastTransitionTime":"2026-03-10T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:48 crc kubenswrapper[4906]: I0310 00:07:48.861904 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:48 crc kubenswrapper[4906]: I0310 00:07:48.861942 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:48 crc kubenswrapper[4906]: I0310 00:07:48.861953 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:48 crc kubenswrapper[4906]: I0310 00:07:48.861969 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:48 crc kubenswrapper[4906]: I0310 00:07:48.861979 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:48Z","lastTransitionTime":"2026-03-10T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:48 crc kubenswrapper[4906]: I0310 00:07:48.965094 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:48 crc kubenswrapper[4906]: I0310 00:07:48.965135 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:48 crc kubenswrapper[4906]: I0310 00:07:48.965149 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:48 crc kubenswrapper[4906]: I0310 00:07:48.965172 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:48 crc kubenswrapper[4906]: I0310 00:07:48.965184 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:48Z","lastTransitionTime":"2026-03-10T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:49 crc kubenswrapper[4906]: I0310 00:07:49.067981 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:49 crc kubenswrapper[4906]: I0310 00:07:49.068040 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:49 crc kubenswrapper[4906]: I0310 00:07:49.068059 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:49 crc kubenswrapper[4906]: I0310 00:07:49.068086 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:49 crc kubenswrapper[4906]: I0310 00:07:49.068105 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:49Z","lastTransitionTime":"2026-03-10T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:49 crc kubenswrapper[4906]: I0310 00:07:49.174780 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:49 crc kubenswrapper[4906]: I0310 00:07:49.175073 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:49 crc kubenswrapper[4906]: I0310 00:07:49.175140 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:49 crc kubenswrapper[4906]: I0310 00:07:49.175206 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:49 crc kubenswrapper[4906]: I0310 00:07:49.175261 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:49Z","lastTransitionTime":"2026-03-10T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:49 crc kubenswrapper[4906]: I0310 00:07:49.278440 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:49 crc kubenswrapper[4906]: I0310 00:07:49.278807 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:49 crc kubenswrapper[4906]: I0310 00:07:49.278907 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:49 crc kubenswrapper[4906]: I0310 00:07:49.279012 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:49 crc kubenswrapper[4906]: I0310 00:07:49.279100 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:49Z","lastTransitionTime":"2026-03-10T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:49 crc kubenswrapper[4906]: I0310 00:07:49.381678 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:49 crc kubenswrapper[4906]: I0310 00:07:49.381717 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:49 crc kubenswrapper[4906]: I0310 00:07:49.381727 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:49 crc kubenswrapper[4906]: I0310 00:07:49.381744 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:49 crc kubenswrapper[4906]: I0310 00:07:49.381753 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:49Z","lastTransitionTime":"2026-03-10T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:49 crc kubenswrapper[4906]: I0310 00:07:49.486275 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:49 crc kubenswrapper[4906]: I0310 00:07:49.486313 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:49 crc kubenswrapper[4906]: I0310 00:07:49.486322 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:49 crc kubenswrapper[4906]: I0310 00:07:49.486339 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:49 crc kubenswrapper[4906]: I0310 00:07:49.486349 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:49Z","lastTransitionTime":"2026-03-10T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:49 crc kubenswrapper[4906]: I0310 00:07:49.588627 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:49 crc kubenswrapper[4906]: I0310 00:07:49.588720 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:49 crc kubenswrapper[4906]: I0310 00:07:49.588740 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:49 crc kubenswrapper[4906]: I0310 00:07:49.588766 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:49 crc kubenswrapper[4906]: I0310 00:07:49.588782 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:49Z","lastTransitionTime":"2026-03-10T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:49 crc kubenswrapper[4906]: I0310 00:07:49.690971 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:49 crc kubenswrapper[4906]: I0310 00:07:49.691019 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:49 crc kubenswrapper[4906]: I0310 00:07:49.691029 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:49 crc kubenswrapper[4906]: I0310 00:07:49.691048 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:49 crc kubenswrapper[4906]: I0310 00:07:49.691062 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:49Z","lastTransitionTime":"2026-03-10T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:49 crc kubenswrapper[4906]: I0310 00:07:49.793808 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:49 crc kubenswrapper[4906]: I0310 00:07:49.794385 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:49 crc kubenswrapper[4906]: I0310 00:07:49.794470 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:49 crc kubenswrapper[4906]: I0310 00:07:49.794545 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:49 crc kubenswrapper[4906]: I0310 00:07:49.794616 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:49Z","lastTransitionTime":"2026-03-10T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:49 crc kubenswrapper[4906]: I0310 00:07:49.897452 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:49 crc kubenswrapper[4906]: I0310 00:07:49.897505 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:49 crc kubenswrapper[4906]: I0310 00:07:49.897518 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:49 crc kubenswrapper[4906]: I0310 00:07:49.897539 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:49 crc kubenswrapper[4906]: I0310 00:07:49.897552 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:49Z","lastTransitionTime":"2026-03-10T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:50 crc kubenswrapper[4906]: I0310 00:07:50.000089 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:50 crc kubenswrapper[4906]: I0310 00:07:50.000131 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:50 crc kubenswrapper[4906]: I0310 00:07:50.000144 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:50 crc kubenswrapper[4906]: I0310 00:07:50.000162 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:50 crc kubenswrapper[4906]: I0310 00:07:50.000176 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:50Z","lastTransitionTime":"2026-03-10T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:50 crc kubenswrapper[4906]: I0310 00:07:50.103758 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:50 crc kubenswrapper[4906]: I0310 00:07:50.103816 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:50 crc kubenswrapper[4906]: I0310 00:07:50.103833 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:50 crc kubenswrapper[4906]: I0310 00:07:50.103857 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:50 crc kubenswrapper[4906]: I0310 00:07:50.103872 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:50Z","lastTransitionTime":"2026-03-10T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:50 crc kubenswrapper[4906]: I0310 00:07:50.206625 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:50 crc kubenswrapper[4906]: I0310 00:07:50.206708 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:50 crc kubenswrapper[4906]: I0310 00:07:50.206724 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:50 crc kubenswrapper[4906]: I0310 00:07:50.206748 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:50 crc kubenswrapper[4906]: I0310 00:07:50.206768 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:50Z","lastTransitionTime":"2026-03-10T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:50 crc kubenswrapper[4906]: I0310 00:07:50.309473 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:50 crc kubenswrapper[4906]: I0310 00:07:50.309524 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:50 crc kubenswrapper[4906]: I0310 00:07:50.309534 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:50 crc kubenswrapper[4906]: I0310 00:07:50.309557 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:50 crc kubenswrapper[4906]: I0310 00:07:50.309570 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:50Z","lastTransitionTime":"2026-03-10T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:50 crc kubenswrapper[4906]: I0310 00:07:50.412303 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:50 crc kubenswrapper[4906]: I0310 00:07:50.412368 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:50 crc kubenswrapper[4906]: I0310 00:07:50.412385 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:50 crc kubenswrapper[4906]: I0310 00:07:50.412412 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:50 crc kubenswrapper[4906]: I0310 00:07:50.412429 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:50Z","lastTransitionTime":"2026-03-10T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:50 crc kubenswrapper[4906]: I0310 00:07:50.514817 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:50 crc kubenswrapper[4906]: I0310 00:07:50.514864 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:50 crc kubenswrapper[4906]: I0310 00:07:50.514874 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:50 crc kubenswrapper[4906]: I0310 00:07:50.514893 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:50 crc kubenswrapper[4906]: I0310 00:07:50.514910 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:50Z","lastTransitionTime":"2026-03-10T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:50 crc kubenswrapper[4906]: I0310 00:07:50.576361 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:07:50 crc kubenswrapper[4906]: I0310 00:07:50.576412 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:50 crc kubenswrapper[4906]: I0310 00:07:50.576449 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:07:50 crc kubenswrapper[4906]: E0310 00:07:50.576557 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:07:50 crc kubenswrapper[4906]: E0310 00:07:50.576676 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:07:50 crc kubenswrapper[4906]: E0310 00:07:50.576839 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:07:50 crc kubenswrapper[4906]: I0310 00:07:50.618061 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:50 crc kubenswrapper[4906]: I0310 00:07:50.618107 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:50 crc kubenswrapper[4906]: I0310 00:07:50.618117 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:50 crc kubenswrapper[4906]: I0310 00:07:50.618136 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:50 crc kubenswrapper[4906]: I0310 00:07:50.618147 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:50Z","lastTransitionTime":"2026-03-10T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:50 crc kubenswrapper[4906]: I0310 00:07:50.720964 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:50 crc kubenswrapper[4906]: I0310 00:07:50.721015 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:50 crc kubenswrapper[4906]: I0310 00:07:50.721025 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:50 crc kubenswrapper[4906]: I0310 00:07:50.721047 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:50 crc kubenswrapper[4906]: I0310 00:07:50.721057 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:50Z","lastTransitionTime":"2026-03-10T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:50 crc kubenswrapper[4906]: I0310 00:07:50.823606 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:50 crc kubenswrapper[4906]: I0310 00:07:50.823662 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:50 crc kubenswrapper[4906]: I0310 00:07:50.823673 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:50 crc kubenswrapper[4906]: I0310 00:07:50.823691 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:50 crc kubenswrapper[4906]: I0310 00:07:50.823704 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:50Z","lastTransitionTime":"2026-03-10T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:50 crc kubenswrapper[4906]: I0310 00:07:50.926816 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:50 crc kubenswrapper[4906]: I0310 00:07:50.926856 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:50 crc kubenswrapper[4906]: I0310 00:07:50.926864 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:50 crc kubenswrapper[4906]: I0310 00:07:50.926881 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:50 crc kubenswrapper[4906]: I0310 00:07:50.926891 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:50Z","lastTransitionTime":"2026-03-10T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:51 crc kubenswrapper[4906]: I0310 00:07:51.028924 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:51 crc kubenswrapper[4906]: I0310 00:07:51.028972 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:51 crc kubenswrapper[4906]: I0310 00:07:51.028985 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:51 crc kubenswrapper[4906]: I0310 00:07:51.029003 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:51 crc kubenswrapper[4906]: I0310 00:07:51.029015 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:51Z","lastTransitionTime":"2026-03-10T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:51 crc kubenswrapper[4906]: I0310 00:07:51.131674 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:51 crc kubenswrapper[4906]: I0310 00:07:51.131979 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:51 crc kubenswrapper[4906]: I0310 00:07:51.132059 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:51 crc kubenswrapper[4906]: I0310 00:07:51.132137 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:51 crc kubenswrapper[4906]: I0310 00:07:51.132207 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:51Z","lastTransitionTime":"2026-03-10T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:51 crc kubenswrapper[4906]: I0310 00:07:51.234750 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:51 crc kubenswrapper[4906]: I0310 00:07:51.234793 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:51 crc kubenswrapper[4906]: I0310 00:07:51.234804 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:51 crc kubenswrapper[4906]: I0310 00:07:51.234821 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:51 crc kubenswrapper[4906]: I0310 00:07:51.234832 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:51Z","lastTransitionTime":"2026-03-10T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:51 crc kubenswrapper[4906]: I0310 00:07:51.337125 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:51 crc kubenswrapper[4906]: I0310 00:07:51.337194 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:51 crc kubenswrapper[4906]: I0310 00:07:51.337206 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:51 crc kubenswrapper[4906]: I0310 00:07:51.337232 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:51 crc kubenswrapper[4906]: I0310 00:07:51.337246 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:51Z","lastTransitionTime":"2026-03-10T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:51 crc kubenswrapper[4906]: I0310 00:07:51.439797 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:51 crc kubenswrapper[4906]: I0310 00:07:51.440297 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:51 crc kubenswrapper[4906]: I0310 00:07:51.440449 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:51 crc kubenswrapper[4906]: I0310 00:07:51.440549 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:51 crc kubenswrapper[4906]: I0310 00:07:51.440651 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:51Z","lastTransitionTime":"2026-03-10T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:51 crc kubenswrapper[4906]: I0310 00:07:51.542797 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:51 crc kubenswrapper[4906]: I0310 00:07:51.542833 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:51 crc kubenswrapper[4906]: I0310 00:07:51.542841 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:51 crc kubenswrapper[4906]: I0310 00:07:51.542857 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:51 crc kubenswrapper[4906]: I0310 00:07:51.542867 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:51Z","lastTransitionTime":"2026-03-10T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:51 crc kubenswrapper[4906]: I0310 00:07:51.645538 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:51 crc kubenswrapper[4906]: I0310 00:07:51.645585 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:51 crc kubenswrapper[4906]: I0310 00:07:51.645596 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:51 crc kubenswrapper[4906]: I0310 00:07:51.645614 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:51 crc kubenswrapper[4906]: I0310 00:07:51.645623 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:51Z","lastTransitionTime":"2026-03-10T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:51 crc kubenswrapper[4906]: I0310 00:07:51.747859 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:51 crc kubenswrapper[4906]: I0310 00:07:51.747914 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:51 crc kubenswrapper[4906]: I0310 00:07:51.747924 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:51 crc kubenswrapper[4906]: I0310 00:07:51.747944 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:51 crc kubenswrapper[4906]: I0310 00:07:51.747955 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:51Z","lastTransitionTime":"2026-03-10T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:51 crc kubenswrapper[4906]: I0310 00:07:51.850715 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:51 crc kubenswrapper[4906]: I0310 00:07:51.850756 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:51 crc kubenswrapper[4906]: I0310 00:07:51.850765 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:51 crc kubenswrapper[4906]: I0310 00:07:51.850782 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:51 crc kubenswrapper[4906]: I0310 00:07:51.850793 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:51Z","lastTransitionTime":"2026-03-10T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:51 crc kubenswrapper[4906]: I0310 00:07:51.953888 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:51 crc kubenswrapper[4906]: I0310 00:07:51.953946 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:51 crc kubenswrapper[4906]: I0310 00:07:51.953964 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:51 crc kubenswrapper[4906]: I0310 00:07:51.953990 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:51 crc kubenswrapper[4906]: I0310 00:07:51.954007 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:51Z","lastTransitionTime":"2026-03-10T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:52 crc kubenswrapper[4906]: I0310 00:07:52.056114 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:52 crc kubenswrapper[4906]: I0310 00:07:52.056153 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:52 crc kubenswrapper[4906]: I0310 00:07:52.056170 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:52 crc kubenswrapper[4906]: I0310 00:07:52.056190 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:52 crc kubenswrapper[4906]: I0310 00:07:52.056202 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:52Z","lastTransitionTime":"2026-03-10T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:52 crc kubenswrapper[4906]: I0310 00:07:52.158874 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:52 crc kubenswrapper[4906]: I0310 00:07:52.158958 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:52 crc kubenswrapper[4906]: I0310 00:07:52.158978 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:52 crc kubenswrapper[4906]: I0310 00:07:52.159004 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:52 crc kubenswrapper[4906]: I0310 00:07:52.159021 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:52Z","lastTransitionTime":"2026-03-10T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:52 crc kubenswrapper[4906]: I0310 00:07:52.261624 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:52 crc kubenswrapper[4906]: I0310 00:07:52.261836 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:52 crc kubenswrapper[4906]: I0310 00:07:52.261899 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:52 crc kubenswrapper[4906]: I0310 00:07:52.261995 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:52 crc kubenswrapper[4906]: I0310 00:07:52.262098 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:52Z","lastTransitionTime":"2026-03-10T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:52 crc kubenswrapper[4906]: I0310 00:07:52.365254 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:52 crc kubenswrapper[4906]: I0310 00:07:52.365303 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:52 crc kubenswrapper[4906]: I0310 00:07:52.365321 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:52 crc kubenswrapper[4906]: I0310 00:07:52.365350 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:52 crc kubenswrapper[4906]: I0310 00:07:52.365373 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:52Z","lastTransitionTime":"2026-03-10T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:52 crc kubenswrapper[4906]: I0310 00:07:52.468733 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:52 crc kubenswrapper[4906]: I0310 00:07:52.468800 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:52 crc kubenswrapper[4906]: I0310 00:07:52.468818 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:52 crc kubenswrapper[4906]: I0310 00:07:52.468847 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:52 crc kubenswrapper[4906]: I0310 00:07:52.468867 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:52Z","lastTransitionTime":"2026-03-10T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:52 crc kubenswrapper[4906]: I0310 00:07:52.571733 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:52 crc kubenswrapper[4906]: I0310 00:07:52.571778 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:52 crc kubenswrapper[4906]: I0310 00:07:52.571787 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:52 crc kubenswrapper[4906]: I0310 00:07:52.571804 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:52 crc kubenswrapper[4906]: I0310 00:07:52.571815 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:52Z","lastTransitionTime":"2026-03-10T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:52 crc kubenswrapper[4906]: I0310 00:07:52.575998 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:52 crc kubenswrapper[4906]: I0310 00:07:52.576023 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:07:52 crc kubenswrapper[4906]: I0310 00:07:52.575998 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:07:52 crc kubenswrapper[4906]: E0310 00:07:52.576129 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:07:52 crc kubenswrapper[4906]: E0310 00:07:52.576205 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:07:52 crc kubenswrapper[4906]: E0310 00:07:52.576274 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:07:52 crc kubenswrapper[4906]: I0310 00:07:52.674865 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:52 crc kubenswrapper[4906]: I0310 00:07:52.675213 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:52 crc kubenswrapper[4906]: I0310 00:07:52.675335 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:52 crc kubenswrapper[4906]: I0310 00:07:52.675451 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:52 crc kubenswrapper[4906]: I0310 00:07:52.675530 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:52Z","lastTransitionTime":"2026-03-10T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:52 crc kubenswrapper[4906]: I0310 00:07:52.779271 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:52 crc kubenswrapper[4906]: I0310 00:07:52.779668 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:52 crc kubenswrapper[4906]: I0310 00:07:52.779740 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:52 crc kubenswrapper[4906]: I0310 00:07:52.779812 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:52 crc kubenswrapper[4906]: I0310 00:07:52.779876 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:52Z","lastTransitionTime":"2026-03-10T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:52 crc kubenswrapper[4906]: I0310 00:07:52.882401 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:52 crc kubenswrapper[4906]: I0310 00:07:52.882486 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:52 crc kubenswrapper[4906]: I0310 00:07:52.882513 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:52 crc kubenswrapper[4906]: I0310 00:07:52.882546 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:52 crc kubenswrapper[4906]: I0310 00:07:52.882567 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:52Z","lastTransitionTime":"2026-03-10T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:52 crc kubenswrapper[4906]: I0310 00:07:52.984163 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:52 crc kubenswrapper[4906]: I0310 00:07:52.984203 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:52 crc kubenswrapper[4906]: I0310 00:07:52.984212 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:52 crc kubenswrapper[4906]: I0310 00:07:52.984228 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:52 crc kubenswrapper[4906]: I0310 00:07:52.984239 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:52Z","lastTransitionTime":"2026-03-10T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:53 crc kubenswrapper[4906]: I0310 00:07:53.087459 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:53 crc kubenswrapper[4906]: I0310 00:07:53.087502 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:53 crc kubenswrapper[4906]: I0310 00:07:53.087515 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:53 crc kubenswrapper[4906]: I0310 00:07:53.087571 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:53 crc kubenswrapper[4906]: I0310 00:07:53.087585 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:53Z","lastTransitionTime":"2026-03-10T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:53 crc kubenswrapper[4906]: I0310 00:07:53.190380 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:53 crc kubenswrapper[4906]: I0310 00:07:53.190443 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:53 crc kubenswrapper[4906]: I0310 00:07:53.190460 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:53 crc kubenswrapper[4906]: I0310 00:07:53.190486 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:53 crc kubenswrapper[4906]: I0310 00:07:53.190503 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:53Z","lastTransitionTime":"2026-03-10T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:53 crc kubenswrapper[4906]: I0310 00:07:53.293580 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:53 crc kubenswrapper[4906]: I0310 00:07:53.293630 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:53 crc kubenswrapper[4906]: I0310 00:07:53.293655 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:53 crc kubenswrapper[4906]: I0310 00:07:53.293675 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:53 crc kubenswrapper[4906]: I0310 00:07:53.293685 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:53Z","lastTransitionTime":"2026-03-10T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:53 crc kubenswrapper[4906]: I0310 00:07:53.396821 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:53 crc kubenswrapper[4906]: I0310 00:07:53.396885 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:53 crc kubenswrapper[4906]: I0310 00:07:53.396897 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:53 crc kubenswrapper[4906]: I0310 00:07:53.396914 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:53 crc kubenswrapper[4906]: I0310 00:07:53.396924 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:53Z","lastTransitionTime":"2026-03-10T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:53 crc kubenswrapper[4906]: I0310 00:07:53.499714 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:53 crc kubenswrapper[4906]: I0310 00:07:53.499795 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:53 crc kubenswrapper[4906]: I0310 00:07:53.499810 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:53 crc kubenswrapper[4906]: I0310 00:07:53.499833 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:53 crc kubenswrapper[4906]: I0310 00:07:53.499847 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:53Z","lastTransitionTime":"2026-03-10T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:53 crc kubenswrapper[4906]: I0310 00:07:53.602296 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:53 crc kubenswrapper[4906]: I0310 00:07:53.602344 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:53 crc kubenswrapper[4906]: I0310 00:07:53.602354 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:53 crc kubenswrapper[4906]: I0310 00:07:53.602373 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:53 crc kubenswrapper[4906]: I0310 00:07:53.602382 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:53Z","lastTransitionTime":"2026-03-10T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:53 crc kubenswrapper[4906]: I0310 00:07:53.704261 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:53 crc kubenswrapper[4906]: I0310 00:07:53.704322 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:53 crc kubenswrapper[4906]: I0310 00:07:53.704341 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:53 crc kubenswrapper[4906]: I0310 00:07:53.704367 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:53 crc kubenswrapper[4906]: I0310 00:07:53.704386 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:53Z","lastTransitionTime":"2026-03-10T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:53 crc kubenswrapper[4906]: I0310 00:07:53.807770 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:53 crc kubenswrapper[4906]: I0310 00:07:53.807810 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:53 crc kubenswrapper[4906]: I0310 00:07:53.807821 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:53 crc kubenswrapper[4906]: I0310 00:07:53.807841 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:53 crc kubenswrapper[4906]: I0310 00:07:53.807853 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:53Z","lastTransitionTime":"2026-03-10T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:53 crc kubenswrapper[4906]: I0310 00:07:53.910871 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:53 crc kubenswrapper[4906]: I0310 00:07:53.910913 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:53 crc kubenswrapper[4906]: I0310 00:07:53.910922 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:53 crc kubenswrapper[4906]: I0310 00:07:53.910944 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:53 crc kubenswrapper[4906]: I0310 00:07:53.910954 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:53Z","lastTransitionTime":"2026-03-10T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.013062 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.013131 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.013150 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.013175 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.013192 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:54Z","lastTransitionTime":"2026-03-10T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.117276 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.117346 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.117369 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.117409 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.117431 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:54Z","lastTransitionTime":"2026-03-10T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.220281 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.220350 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.220367 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.220395 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.220417 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:54Z","lastTransitionTime":"2026-03-10T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.322939 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.322987 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.322996 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.323018 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.323028 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:54Z","lastTransitionTime":"2026-03-10T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.425872 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.425935 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.425954 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.425979 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.425992 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:54Z","lastTransitionTime":"2026-03-10T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.528692 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.528727 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.528738 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.528754 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.528765 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:54Z","lastTransitionTime":"2026-03-10T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.575856 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:07:54 crc kubenswrapper[4906]: E0310 00:07:54.575999 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.576150 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:54 crc kubenswrapper[4906]: E0310 00:07:54.576359 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.576384 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:07:54 crc kubenswrapper[4906]: E0310 00:07:54.576506 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.590117 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c37b204d34ae2c304ca12b8978e189ce99a6c9352c9f68e01d074a8fccc1d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"r
ecursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.607911 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72569fcc50f473ea417eb7403f0901fc02d50a886ea39d62b2bb02682bc64dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-ove
rrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ed04015b5fffd0db47db60c0aec762bd61d476c7cc3c0e726a5bd5de43baef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.621292 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8518e6b0-48c0-4076-a58a-e2ba8751c90f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1ce33b4800ac33b19f03506f197894abd2b965c34db7b506d7f06e8824b0229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1237a4e233c776796482d73c65c85e5d86e592d6a7f53a7b76fea4529fe9e435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e445f27670002f5ac446f9be56896acdbf9e3e30801b073b231cc8ce7ddac7c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75634c35b3e26fec38c8aa89f5a8055117f60566c8d46cf9ec28cad40c688be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab422d45102787731cdbfd6d0e400974761d5af01f0a43d04cc25228c1db30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:54Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 00:06:54.228491 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:06:54.228657 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:06:54.229423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1709734890/tls.crt::/tmp/serving-cert-1709734890/tls.key\\\\\\\"\\\\nI0310 00:06:54.538287 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:06:54.541181 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:06:54.541202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:06:54.541225 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:06:54.541230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:06:54.549119 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 00:06:54.549178 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549189 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549199 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:06:54.549207 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:06:54.549213 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:06:54.549219 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 00:06:54.549128 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0310 00:06:54.551727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c49063cf40b4074c120bf78de05481a9e5638b11137a47d73d15b3e1a43f41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca17
0f5955a2014a2d1e693bfb82373f964\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.631121 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.631183 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.631211 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.631237 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.631253 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:54Z","lastTransitionTime":"2026-03-10T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.638069 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8681864-d447-44e6-9cf5-6408d49ec58a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4aeb4d2ad9fadbae974054da89166076b81a944e08f5475f1a9382a48cdc623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.655137 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.673113 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.693228 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://161718aee1b97751dccdff3689a0dc26eb0efb1842899be5bb4994598adb36db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.712472 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.733588 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.733665 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.733677 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.733699 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.733712 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:54Z","lastTransitionTime":"2026-03-10T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.837081 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.837140 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.837158 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.837183 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.837200 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:54Z","lastTransitionTime":"2026-03-10T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.940606 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.940677 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.940689 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.940709 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:54 crc kubenswrapper[4906]: I0310 00:07:54.940722 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:54Z","lastTransitionTime":"2026-03-10T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.043111 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.043157 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.043171 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.043189 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.043202 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:55Z","lastTransitionTime":"2026-03-10T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.114483 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.114541 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.114558 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.114588 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.114607 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:55Z","lastTransitionTime":"2026-03-10T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:55 crc kubenswrapper[4906]: E0310 00:07:55.129662 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a60179e-98a5-4d7f-9dd0-5aef84f37492\\\",\\\"systemUUID\\\":\\\"5a964b87-c4f2-4bba-95e1-b2c12e6316ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.134666 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.134721 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.134733 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.134756 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.134770 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:55Z","lastTransitionTime":"2026-03-10T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:55 crc kubenswrapper[4906]: E0310 00:07:55.152206 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a60179e-98a5-4d7f-9dd0-5aef84f37492\\\",\\\"systemUUID\\\":\\\"5a964b87-c4f2-4bba-95e1-b2c12e6316ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.156412 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.156450 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.156461 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.156479 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.156491 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:55Z","lastTransitionTime":"2026-03-10T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:55 crc kubenswrapper[4906]: E0310 00:07:55.176192 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a60179e-98a5-4d7f-9dd0-5aef84f37492\\\",\\\"systemUUID\\\":\\\"5a964b87-c4f2-4bba-95e1-b2c12e6316ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.181500 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.181540 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.181556 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.181577 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.181590 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:55Z","lastTransitionTime":"2026-03-10T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:55 crc kubenswrapper[4906]: E0310 00:07:55.201112 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a60179e-98a5-4d7f-9dd0-5aef84f37492\\\",\\\"systemUUID\\\":\\\"5a964b87-c4f2-4bba-95e1-b2c12e6316ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.209567 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.209830 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.209976 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.210129 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.210276 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:55Z","lastTransitionTime":"2026-03-10T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:55 crc kubenswrapper[4906]: E0310 00:07:55.234750 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a60179e-98a5-4d7f-9dd0-5aef84f37492\\\",\\\"systemUUID\\\":\\\"5a964b87-c4f2-4bba-95e1-b2c12e6316ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:55 crc kubenswrapper[4906]: E0310 00:07:55.234942 4906 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.237397 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.237444 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.237457 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.237475 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.237489 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:55Z","lastTransitionTime":"2026-03-10T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.339524 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.339596 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.339620 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.339690 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.339734 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:55Z","lastTransitionTime":"2026-03-10T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.442291 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.442357 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.442374 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.442399 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.442418 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:55Z","lastTransitionTime":"2026-03-10T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.544894 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.544945 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.544961 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.544987 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.545004 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:55Z","lastTransitionTime":"2026-03-10T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.648194 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.648247 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.648263 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.648289 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.648308 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:55Z","lastTransitionTime":"2026-03-10T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.751366 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.751418 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.751434 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.751458 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.751474 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:55Z","lastTransitionTime":"2026-03-10T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.854626 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.854744 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.854762 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.854791 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.854809 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:55Z","lastTransitionTime":"2026-03-10T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.957745 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.957793 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.957807 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.957827 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:55 crc kubenswrapper[4906]: I0310 00:07:55.957841 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:55Z","lastTransitionTime":"2026-03-10T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:56 crc kubenswrapper[4906]: I0310 00:07:56.060474 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:56 crc kubenswrapper[4906]: I0310 00:07:56.060546 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:56 crc kubenswrapper[4906]: I0310 00:07:56.060573 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:56 crc kubenswrapper[4906]: I0310 00:07:56.060607 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:56 crc kubenswrapper[4906]: I0310 00:07:56.060665 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:56Z","lastTransitionTime":"2026-03-10T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:56 crc kubenswrapper[4906]: I0310 00:07:56.163857 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:56 crc kubenswrapper[4906]: I0310 00:07:56.163914 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:56 crc kubenswrapper[4906]: I0310 00:07:56.163932 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:56 crc kubenswrapper[4906]: I0310 00:07:56.163957 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:56 crc kubenswrapper[4906]: I0310 00:07:56.163973 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:56Z","lastTransitionTime":"2026-03-10T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:56 crc kubenswrapper[4906]: I0310 00:07:56.267259 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:56 crc kubenswrapper[4906]: I0310 00:07:56.267325 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:56 crc kubenswrapper[4906]: I0310 00:07:56.267345 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:56 crc kubenswrapper[4906]: I0310 00:07:56.267374 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:56 crc kubenswrapper[4906]: I0310 00:07:56.267394 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:56Z","lastTransitionTime":"2026-03-10T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:56 crc kubenswrapper[4906]: I0310 00:07:56.370800 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:56 crc kubenswrapper[4906]: I0310 00:07:56.370876 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:56 crc kubenswrapper[4906]: I0310 00:07:56.370890 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:56 crc kubenswrapper[4906]: I0310 00:07:56.370910 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:56 crc kubenswrapper[4906]: I0310 00:07:56.370923 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:56Z","lastTransitionTime":"2026-03-10T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:56 crc kubenswrapper[4906]: I0310 00:07:56.413220 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:07:56 crc kubenswrapper[4906]: E0310 00:07:56.413603 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-10 00:08:28.413557817 +0000 UTC m=+134.561452969 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:07:56 crc kubenswrapper[4906]: I0310 00:07:56.473779 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:56 crc kubenswrapper[4906]: I0310 00:07:56.473846 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:56 crc kubenswrapper[4906]: I0310 00:07:56.473863 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:56 crc kubenswrapper[4906]: I0310 00:07:56.473892 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:56 crc kubenswrapper[4906]: I0310 00:07:56.473913 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:56Z","lastTransitionTime":"2026-03-10T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:56 crc kubenswrapper[4906]: I0310 00:07:56.514622 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:07:56 crc kubenswrapper[4906]: I0310 00:07:56.514754 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:07:56 crc kubenswrapper[4906]: I0310 00:07:56.514807 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:56 crc kubenswrapper[4906]: E0310 00:07:56.514849 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 00:07:56 crc kubenswrapper[4906]: I0310 00:07:56.514864 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:56 crc kubenswrapper[4906]: E0310 00:07:56.514888 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 00:07:56 crc kubenswrapper[4906]: E0310 00:07:56.514907 4906 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:56 crc kubenswrapper[4906]: E0310 00:07:56.514947 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 00:07:56 crc kubenswrapper[4906]: E0310 00:07:56.514979 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 00:07:56 crc kubenswrapper[4906]: E0310 00:07:56.514994 4906 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:56 crc kubenswrapper[4906]: E0310 00:07:56.515015 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 00:08:28.514962907 +0000 UTC m=+134.662858059 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:56 crc kubenswrapper[4906]: E0310 00:07:56.515021 4906 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 00:07:56 crc kubenswrapper[4906]: E0310 00:07:56.515062 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 00:08:28.515040979 +0000 UTC m=+134.662936091 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:07:56 crc kubenswrapper[4906]: E0310 00:07:56.515089 4906 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 00:07:56 crc kubenswrapper[4906]: E0310 00:07:56.515122 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-10 00:08:28.51509219 +0000 UTC m=+134.662987352 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 00:07:56 crc kubenswrapper[4906]: E0310 00:07:56.515186 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 00:08:28.515158432 +0000 UTC m=+134.663053594 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 00:07:56 crc kubenswrapper[4906]: I0310 00:07:56.575679 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:07:56 crc kubenswrapper[4906]: I0310 00:07:56.575703 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:56 crc kubenswrapper[4906]: I0310 00:07:56.575886 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:07:56 crc kubenswrapper[4906]: E0310 00:07:56.575981 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:07:56 crc kubenswrapper[4906]: E0310 00:07:56.576152 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:07:56 crc kubenswrapper[4906]: E0310 00:07:56.576307 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:07:56 crc kubenswrapper[4906]: I0310 00:07:56.576885 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:56 crc kubenswrapper[4906]: I0310 00:07:56.576918 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:56 crc kubenswrapper[4906]: I0310 00:07:56.576927 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:56 crc kubenswrapper[4906]: I0310 00:07:56.576945 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:56 crc kubenswrapper[4906]: I0310 00:07:56.576955 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:56Z","lastTransitionTime":"2026-03-10T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:56 crc kubenswrapper[4906]: I0310 00:07:56.679985 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:56 crc kubenswrapper[4906]: I0310 00:07:56.680040 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:56 crc kubenswrapper[4906]: I0310 00:07:56.680057 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:56 crc kubenswrapper[4906]: I0310 00:07:56.680084 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:56 crc kubenswrapper[4906]: I0310 00:07:56.680102 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:56Z","lastTransitionTime":"2026-03-10T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:56 crc kubenswrapper[4906]: I0310 00:07:56.782935 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:56 crc kubenswrapper[4906]: I0310 00:07:56.782994 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:56 crc kubenswrapper[4906]: I0310 00:07:56.783010 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:56 crc kubenswrapper[4906]: I0310 00:07:56.783041 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:56 crc kubenswrapper[4906]: I0310 00:07:56.783059 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:56Z","lastTransitionTime":"2026-03-10T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:56 crc kubenswrapper[4906]: I0310 00:07:56.886745 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:56 crc kubenswrapper[4906]: I0310 00:07:56.886800 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:56 crc kubenswrapper[4906]: I0310 00:07:56.886817 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:56 crc kubenswrapper[4906]: I0310 00:07:56.886844 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:56 crc kubenswrapper[4906]: I0310 00:07:56.886863 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:56Z","lastTransitionTime":"2026-03-10T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:56 crc kubenswrapper[4906]: I0310 00:07:56.990686 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:56 crc kubenswrapper[4906]: I0310 00:07:56.990757 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:56 crc kubenswrapper[4906]: I0310 00:07:56.990777 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:56 crc kubenswrapper[4906]: I0310 00:07:56.990804 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:56 crc kubenswrapper[4906]: I0310 00:07:56.990822 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:56Z","lastTransitionTime":"2026-03-10T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:57 crc kubenswrapper[4906]: I0310 00:07:57.093724 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:57 crc kubenswrapper[4906]: I0310 00:07:57.093783 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:57 crc kubenswrapper[4906]: I0310 00:07:57.093802 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:57 crc kubenswrapper[4906]: I0310 00:07:57.093826 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:57 crc kubenswrapper[4906]: I0310 00:07:57.093845 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:57Z","lastTransitionTime":"2026-03-10T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:57 crc kubenswrapper[4906]: I0310 00:07:57.197080 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:57 crc kubenswrapper[4906]: I0310 00:07:57.197163 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:57 crc kubenswrapper[4906]: I0310 00:07:57.197188 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:57 crc kubenswrapper[4906]: I0310 00:07:57.197222 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:57 crc kubenswrapper[4906]: I0310 00:07:57.197245 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:57Z","lastTransitionTime":"2026-03-10T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:57 crc kubenswrapper[4906]: I0310 00:07:57.300069 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:57 crc kubenswrapper[4906]: I0310 00:07:57.300107 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:57 crc kubenswrapper[4906]: I0310 00:07:57.300118 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:57 crc kubenswrapper[4906]: I0310 00:07:57.300139 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:57 crc kubenswrapper[4906]: I0310 00:07:57.300149 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:57Z","lastTransitionTime":"2026-03-10T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:57 crc kubenswrapper[4906]: I0310 00:07:57.403298 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:57 crc kubenswrapper[4906]: I0310 00:07:57.403356 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:57 crc kubenswrapper[4906]: I0310 00:07:57.403373 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:57 crc kubenswrapper[4906]: I0310 00:07:57.403404 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:57 crc kubenswrapper[4906]: I0310 00:07:57.403422 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:57Z","lastTransitionTime":"2026-03-10T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:57 crc kubenswrapper[4906]: I0310 00:07:57.506231 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:57 crc kubenswrapper[4906]: I0310 00:07:57.506296 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:57 crc kubenswrapper[4906]: I0310 00:07:57.506313 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:57 crc kubenswrapper[4906]: I0310 00:07:57.506337 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:57 crc kubenswrapper[4906]: I0310 00:07:57.506354 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:57Z","lastTransitionTime":"2026-03-10T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:57 crc kubenswrapper[4906]: I0310 00:07:57.609389 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:57 crc kubenswrapper[4906]: I0310 00:07:57.609454 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:57 crc kubenswrapper[4906]: I0310 00:07:57.609473 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:57 crc kubenswrapper[4906]: I0310 00:07:57.609502 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:57 crc kubenswrapper[4906]: I0310 00:07:57.609519 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:57Z","lastTransitionTime":"2026-03-10T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:57 crc kubenswrapper[4906]: I0310 00:07:57.712686 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:57 crc kubenswrapper[4906]: I0310 00:07:57.712746 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:57 crc kubenswrapper[4906]: I0310 00:07:57.712763 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:57 crc kubenswrapper[4906]: I0310 00:07:57.712796 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:57 crc kubenswrapper[4906]: I0310 00:07:57.712812 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:57Z","lastTransitionTime":"2026-03-10T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:57 crc kubenswrapper[4906]: I0310 00:07:57.816242 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:57 crc kubenswrapper[4906]: I0310 00:07:57.816315 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:57 crc kubenswrapper[4906]: I0310 00:07:57.816332 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:57 crc kubenswrapper[4906]: I0310 00:07:57.816360 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:57 crc kubenswrapper[4906]: I0310 00:07:57.816378 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:57Z","lastTransitionTime":"2026-03-10T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:57 crc kubenswrapper[4906]: I0310 00:07:57.919396 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:57 crc kubenswrapper[4906]: I0310 00:07:57.919463 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:57 crc kubenswrapper[4906]: I0310 00:07:57.919480 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:57 crc kubenswrapper[4906]: I0310 00:07:57.919507 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:57 crc kubenswrapper[4906]: I0310 00:07:57.919524 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:57Z","lastTransitionTime":"2026-03-10T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:58 crc kubenswrapper[4906]: I0310 00:07:58.021951 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:58 crc kubenswrapper[4906]: I0310 00:07:58.021987 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:58 crc kubenswrapper[4906]: I0310 00:07:58.021999 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:58 crc kubenswrapper[4906]: I0310 00:07:58.022016 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:58 crc kubenswrapper[4906]: I0310 00:07:58.022028 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:58Z","lastTransitionTime":"2026-03-10T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:58 crc kubenswrapper[4906]: I0310 00:07:58.124843 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:58 crc kubenswrapper[4906]: I0310 00:07:58.124887 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:58 crc kubenswrapper[4906]: I0310 00:07:58.124898 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:58 crc kubenswrapper[4906]: I0310 00:07:58.124914 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:58 crc kubenswrapper[4906]: I0310 00:07:58.124922 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:58Z","lastTransitionTime":"2026-03-10T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:58 crc kubenswrapper[4906]: I0310 00:07:58.227998 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:58 crc kubenswrapper[4906]: I0310 00:07:58.228049 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:58 crc kubenswrapper[4906]: I0310 00:07:58.228062 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:58 crc kubenswrapper[4906]: I0310 00:07:58.228087 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:58 crc kubenswrapper[4906]: I0310 00:07:58.228098 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:58Z","lastTransitionTime":"2026-03-10T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:58 crc kubenswrapper[4906]: I0310 00:07:58.330255 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:58 crc kubenswrapper[4906]: I0310 00:07:58.330324 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:58 crc kubenswrapper[4906]: I0310 00:07:58.330336 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:58 crc kubenswrapper[4906]: I0310 00:07:58.330356 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:58 crc kubenswrapper[4906]: I0310 00:07:58.330370 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:58Z","lastTransitionTime":"2026-03-10T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:58 crc kubenswrapper[4906]: I0310 00:07:58.432660 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:58 crc kubenswrapper[4906]: I0310 00:07:58.432713 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:58 crc kubenswrapper[4906]: I0310 00:07:58.432725 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:58 crc kubenswrapper[4906]: I0310 00:07:58.432747 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:58 crc kubenswrapper[4906]: I0310 00:07:58.432761 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:58Z","lastTransitionTime":"2026-03-10T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:58 crc kubenswrapper[4906]: I0310 00:07:58.535254 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:58 crc kubenswrapper[4906]: I0310 00:07:58.535297 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:58 crc kubenswrapper[4906]: I0310 00:07:58.535309 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:58 crc kubenswrapper[4906]: I0310 00:07:58.535327 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:58 crc kubenswrapper[4906]: I0310 00:07:58.535338 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:58Z","lastTransitionTime":"2026-03-10T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:58 crc kubenswrapper[4906]: I0310 00:07:58.576180 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:07:58 crc kubenswrapper[4906]: I0310 00:07:58.576192 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:07:58 crc kubenswrapper[4906]: E0310 00:07:58.576324 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:07:58 crc kubenswrapper[4906]: I0310 00:07:58.576413 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:07:58 crc kubenswrapper[4906]: E0310 00:07:58.576866 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:07:58 crc kubenswrapper[4906]: E0310 00:07:58.576920 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:07:58 crc kubenswrapper[4906]: I0310 00:07:58.592226 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 10 00:07:58 crc kubenswrapper[4906]: I0310 00:07:58.638443 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:58 crc kubenswrapper[4906]: I0310 00:07:58.638534 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:58 crc kubenswrapper[4906]: I0310 00:07:58.638544 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:58 crc kubenswrapper[4906]: I0310 00:07:58.638564 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:58 crc kubenswrapper[4906]: I0310 00:07:58.638574 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:58Z","lastTransitionTime":"2026-03-10T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:58 crc kubenswrapper[4906]: I0310 00:07:58.741302 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:58 crc kubenswrapper[4906]: I0310 00:07:58.741350 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:58 crc kubenswrapper[4906]: I0310 00:07:58.741361 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:58 crc kubenswrapper[4906]: I0310 00:07:58.741379 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:58 crc kubenswrapper[4906]: I0310 00:07:58.741389 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:58Z","lastTransitionTime":"2026-03-10T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:58 crc kubenswrapper[4906]: I0310 00:07:58.843362 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:58 crc kubenswrapper[4906]: I0310 00:07:58.843424 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:58 crc kubenswrapper[4906]: I0310 00:07:58.843440 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:58 crc kubenswrapper[4906]: I0310 00:07:58.843460 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:58 crc kubenswrapper[4906]: I0310 00:07:58.843474 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:58Z","lastTransitionTime":"2026-03-10T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:58 crc kubenswrapper[4906]: I0310 00:07:58.947707 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:58 crc kubenswrapper[4906]: I0310 00:07:58.947794 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:58 crc kubenswrapper[4906]: I0310 00:07:58.947820 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:58 crc kubenswrapper[4906]: I0310 00:07:58.947862 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:58 crc kubenswrapper[4906]: I0310 00:07:58.947888 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:58Z","lastTransitionTime":"2026-03-10T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.052344 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.052409 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.052430 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.052457 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.052475 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:59Z","lastTransitionTime":"2026-03-10T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.155401 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.155450 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.155459 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.155477 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.155486 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:59Z","lastTransitionTime":"2026-03-10T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.258371 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.258425 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.258434 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.258454 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.258463 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:59Z","lastTransitionTime":"2026-03-10T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.360686 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.360737 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.360745 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.360770 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.360783 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:59Z","lastTransitionTime":"2026-03-10T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.463964 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.464016 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.464027 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.464049 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.464062 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:59Z","lastTransitionTime":"2026-03-10T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.567524 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.567561 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.567569 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.567592 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.567603 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:59Z","lastTransitionTime":"2026-03-10T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.671086 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.671129 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.671138 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.671153 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.671162 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:59Z","lastTransitionTime":"2026-03-10T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.773584 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.773615 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.773623 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.773657 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.773666 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:59Z","lastTransitionTime":"2026-03-10T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.810795 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-9bd57"] Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.811160 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-9bd57" Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.812911 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.813375 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.814480 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.825852 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://161718aee1b97751dccdff3689a0dc26eb0efb1842899be5bb4994598adb36db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":
{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.842983 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1c5481b-1561-42af-9126-977132b7ced2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8b43acbd8d86bc6523a3304162f30fcf4068197c4a56018dc56e1c8ae0bcd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3effb7d619bec3fd1ef64a418efe1e3503cb2dd187ce2188911a90b4034d331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad3cc19888cbd84c556e52eda6a3be770768b7316982d6442717f6aee62191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd888b723dd7b84ded4f27c69d82e2b203fec96dbaab8198df43bf0806b34ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e5dc928d14830f187866775b2cd95c46cff2d7749025b36d16f6489662bafa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.854241 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.869064 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.876019 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.876047 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.876061 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.876081 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.876093 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:59Z","lastTransitionTime":"2026-03-10T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.880784 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.889866 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bd57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a677000-e7dd-40fa-ad3e-94c061293e29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hdfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bd57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.906250 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8518e6b0-48c0-4076-a58a-e2ba8751c90f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1ce33b4800ac33b19f03506f197894abd2b965c34db7b506d7f06e8824b0229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1237a4e233c776796482d73c65c85e5d86e592d6a7f53a7b76fea4529fe9e435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e445f27670002f5ac446f9be56896acdbf9e3e30801b073b231cc8ce7ddac7c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75634c35b3e26fec38c8aa89f5a8055117f60566c8d46cf9ec28cad40c688be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab422d45102787731cdbfd6d0e400974761d5af01f0a43d04cc25228c1db30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:54Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 00:06:54.228491 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:06:54.228657 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:06:54.229423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1709734890/tls.crt::/tmp/serving-cert-1709734890/tls.key\\\\\\\"\\\\nI0310 00:06:54.538287 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:06:54.541181 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:06:54.541202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:06:54.541225 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:06:54.541230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:06:54.549119 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 00:06:54.549178 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549189 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549199 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:06:54.549207 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:06:54.549213 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:06:54.549219 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 00:06:54.549128 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0310 00:06:54.551727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c49063cf40b4074c120bf78de05481a9e5638b11137a47d73d15b3e1a43f41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca17
0f5955a2014a2d1e693bfb82373f964\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.915871 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8681864-d447-44e6-9cf5-6408d49ec58a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4aeb4d2ad9fadbae974054da89166076b81a944e08f5475f1a9382a48cdc623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.929358 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c37b204d34ae2c304ca12b8978e189ce99a6c9352c9f68e01d074a8fccc1d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.941338 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72569fcc50f473ea417eb7403f0901fc02d50a886ea39d62b2bb02682bc64dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://d4ed04015b5fffd0db47db60c0aec762bd61d476c7cc3c0e726a5bd5de43baef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.944661 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9a677000-e7dd-40fa-ad3e-94c061293e29-hosts-file\") pod \"node-resolver-9bd57\" (UID: \"9a677000-e7dd-40fa-ad3e-94c061293e29\") " pod="openshift-dns/node-resolver-9bd57" Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.944730 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hdfw\" (UniqueName: 
\"kubernetes.io/projected/9a677000-e7dd-40fa-ad3e-94c061293e29-kube-api-access-6hdfw\") pod \"node-resolver-9bd57\" (UID: \"9a677000-e7dd-40fa-ad3e-94c061293e29\") " pod="openshift-dns/node-resolver-9bd57" Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.978811 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.978867 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.978877 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.978895 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:07:59 crc kubenswrapper[4906]: I0310 00:07:59.978905 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:07:59Z","lastTransitionTime":"2026-03-10T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.046115 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hdfw\" (UniqueName: \"kubernetes.io/projected/9a677000-e7dd-40fa-ad3e-94c061293e29-kube-api-access-6hdfw\") pod \"node-resolver-9bd57\" (UID: \"9a677000-e7dd-40fa-ad3e-94c061293e29\") " pod="openshift-dns/node-resolver-9bd57" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.046159 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9a677000-e7dd-40fa-ad3e-94c061293e29-hosts-file\") pod \"node-resolver-9bd57\" (UID: \"9a677000-e7dd-40fa-ad3e-94c061293e29\") " pod="openshift-dns/node-resolver-9bd57" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.046246 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9a677000-e7dd-40fa-ad3e-94c061293e29-hosts-file\") pod \"node-resolver-9bd57\" (UID: \"9a677000-e7dd-40fa-ad3e-94c061293e29\") " pod="openshift-dns/node-resolver-9bd57" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.064440 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hdfw\" (UniqueName: \"kubernetes.io/projected/9a677000-e7dd-40fa-ad3e-94c061293e29-kube-api-access-6hdfw\") pod \"node-resolver-9bd57\" (UID: \"9a677000-e7dd-40fa-ad3e-94c061293e29\") " pod="openshift-dns/node-resolver-9bd57" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.081251 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.081288 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.081298 4906 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.081315 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.081327 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:00Z","lastTransitionTime":"2026-03-10T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.129007 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9bd57" Mar 10 00:08:00 crc kubenswrapper[4906]: W0310 00:08:00.141747 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a677000_e7dd_40fa_ad3e_94c061293e29.slice/crio-d3d9dfd9d37061f70c1a1aa2773f3b3b7b00468d6032bc87de5ec7c5b98acc96 WatchSource:0}: Error finding container d3d9dfd9d37061f70c1a1aa2773f3b3b7b00468d6032bc87de5ec7c5b98acc96: Status 404 returned error can't find the container with id d3d9dfd9d37061f70c1a1aa2773f3b3b7b00468d6032bc87de5ec7c5b98acc96 Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.173062 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-jgfpc"] Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.173992 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-85dv2"] Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.174143 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-bxtw4"] Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 
00:08:00.174209 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-jgfpc" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.174432 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.174483 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-85dv2" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.177768 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.177781 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.178297 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.178446 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.178514 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.178579 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.178623 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.178716 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 10 00:08:00 
crc kubenswrapper[4906]: I0310 00:08:00.178745 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.178828 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.179008 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.179401 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.184532 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.184569 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.184581 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.184599 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.184611 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:00Z","lastTransitionTime":"2026-03-10T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.208155 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1c5481b-1561-42af-9126-977132b7ced2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8b43acbd8d86bc6523a3304162f30fcf4068197c4a56018dc56e1c8ae0bcd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3effb7d619bec3fd1ef64a418efe1e3503cb2dd187ce2188911a90b4034d331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad3cc19888cbd84c556e52eda6a3be770768b7316982d6442717f6aee62191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd888b723dd7b84ded4f27c69d82e2b203fec96dbaab8198df43bf0806b34ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e5dc928d14830f187866775b2cd95c46cff2d7749025b36d16f6489662bafa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.220073 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.230658 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.246473 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.257627 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://161718aee1b97751dccdff3689a0dc26eb0efb1842899be5bb4994598adb36db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.274829 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jgfpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e902b2c-8bf9-49e8-9820-392f34dbfb10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jgfpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.285250 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8681864-d447-44e6-9cf5-6408d49ec58a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4aeb4d2ad9fadbae974054da89166076b81a944e08f5475f1a9382a48cdc623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.287892 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.287928 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.287938 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.287956 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.287965 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:00Z","lastTransitionTime":"2026-03-10T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.299327 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c37b204d34ae2c304ca12b8978e189ce99a6c9352c9f68e01d074a8fccc1d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.311277 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72569fcc50f473ea417eb7403f0901fc02d50a886ea39d62b2bb02682bc64dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ed04015b5fffd0db47db60c0aec762bd61d476c7cc3c0e726a5bd5de43baef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.322219 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bd57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a677000-e7dd-40fa-ad3e-94c061293e29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hdfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bd57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.335713 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8518e6b0-48c0-4076-a58a-e2ba8751c90f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1ce33b4800ac33b19f03506f197894abd2b965c34db7b506d7f06e8824b0229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1237a4e233c776796482d73c65c85e5d86e592d6a7f53a7b76fea4529fe9e435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e445f27670002f5ac446f9be56896acdbf9e3e30801b073b231cc8ce7ddac7c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75634c35b3e26fec38c8aa89f5a8055117f60566c8d46cf9ec28cad40c688be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab422d45102787731cdbfd6d0e400974761d5af01f0a43d04cc25228c1db30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:54Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 00:06:54.228491 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:06:54.228657 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:06:54.229423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1709734890/tls.crt::/tmp/serving-cert-1709734890/tls.key\\\\\\\"\\\\nI0310 00:06:54.538287 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:06:54.541181 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:06:54.541202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:06:54.541225 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:06:54.541230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:06:54.549119 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 00:06:54.549178 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549189 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549199 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:06:54.549207 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:06:54.549213 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:06:54.549219 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 00:06:54.549128 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0310 00:06:54.551727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c49063cf40b4074c120bf78de05481a9e5638b11137a47d73d15b3e1a43f41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca17
0f5955a2014a2d1e693bfb82373f964\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.347761 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7e902b2c-8bf9-49e8-9820-392f34dbfb10-os-release\") pod \"multus-additional-cni-plugins-jgfpc\" (UID: \"7e902b2c-8bf9-49e8-9820-392f34dbfb10\") " pod="openshift-multus/multus-additional-cni-plugins-jgfpc" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.347806 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0c494c18-0d46-4e23-8ef5-214938a66a7b-os-release\") pod \"multus-85dv2\" (UID: \"0c494c18-0d46-4e23-8ef5-214938a66a7b\") " pod="openshift-multus/multus-85dv2" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.347838 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7e902b2c-8bf9-49e8-9820-392f34dbfb10-system-cni-dir\") pod \"multus-additional-cni-plugins-jgfpc\" (UID: 
\"7e902b2c-8bf9-49e8-9820-392f34dbfb10\") " pod="openshift-multus/multus-additional-cni-plugins-jgfpc" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.347865 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0c494c18-0d46-4e23-8ef5-214938a66a7b-cni-binary-copy\") pod \"multus-85dv2\" (UID: \"0c494c18-0d46-4e23-8ef5-214938a66a7b\") " pod="openshift-multus/multus-85dv2" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.347889 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0c494c18-0d46-4e23-8ef5-214938a66a7b-multus-cni-dir\") pod \"multus-85dv2\" (UID: \"0c494c18-0d46-4e23-8ef5-214938a66a7b\") " pod="openshift-multus/multus-85dv2" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.347935 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7e902b2c-8bf9-49e8-9820-392f34dbfb10-cni-binary-copy\") pod \"multus-additional-cni-plugins-jgfpc\" (UID: \"7e902b2c-8bf9-49e8-9820-392f34dbfb10\") " pod="openshift-multus/multus-additional-cni-plugins-jgfpc" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.347958 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/72d61d35-0a64-45a5-8df3-9c429727deba-mcd-auth-proxy-config\") pod \"machine-config-daemon-bxtw4\" (UID: \"72d61d35-0a64-45a5-8df3-9c429727deba\") " pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.347978 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/7e902b2c-8bf9-49e8-9820-392f34dbfb10-cnibin\") pod \"multus-additional-cni-plugins-jgfpc\" (UID: \"7e902b2c-8bf9-49e8-9820-392f34dbfb10\") " pod="openshift-multus/multus-additional-cni-plugins-jgfpc" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.347996 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0c494c18-0d46-4e23-8ef5-214938a66a7b-multus-conf-dir\") pod \"multus-85dv2\" (UID: \"0c494c18-0d46-4e23-8ef5-214938a66a7b\") " pod="openshift-multus/multus-85dv2" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.348055 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0c494c18-0d46-4e23-8ef5-214938a66a7b-multus-socket-dir-parent\") pod \"multus-85dv2\" (UID: \"0c494c18-0d46-4e23-8ef5-214938a66a7b\") " pod="openshift-multus/multus-85dv2" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.348079 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftt77\" (UniqueName: \"kubernetes.io/projected/0c494c18-0d46-4e23-8ef5-214938a66a7b-kube-api-access-ftt77\") pod \"multus-85dv2\" (UID: \"0c494c18-0d46-4e23-8ef5-214938a66a7b\") " pod="openshift-multus/multus-85dv2" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.348101 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0c494c18-0d46-4e23-8ef5-214938a66a7b-host-run-multus-certs\") pod \"multus-85dv2\" (UID: \"0c494c18-0d46-4e23-8ef5-214938a66a7b\") " pod="openshift-multus/multus-85dv2" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.348120 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" 
(UniqueName: \"kubernetes.io/host-path/7e902b2c-8bf9-49e8-9820-392f34dbfb10-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jgfpc\" (UID: \"7e902b2c-8bf9-49e8-9820-392f34dbfb10\") " pod="openshift-multus/multus-additional-cni-plugins-jgfpc" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.348141 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0c494c18-0d46-4e23-8ef5-214938a66a7b-etc-kubernetes\") pod \"multus-85dv2\" (UID: \"0c494c18-0d46-4e23-8ef5-214938a66a7b\") " pod="openshift-multus/multus-85dv2" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.348163 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ccf9\" (UniqueName: \"kubernetes.io/projected/72d61d35-0a64-45a5-8df3-9c429727deba-kube-api-access-5ccf9\") pod \"machine-config-daemon-bxtw4\" (UID: \"72d61d35-0a64-45a5-8df3-9c429727deba\") " pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.348186 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/72d61d35-0a64-45a5-8df3-9c429727deba-proxy-tls\") pod \"machine-config-daemon-bxtw4\" (UID: \"72d61d35-0a64-45a5-8df3-9c429727deba\") " pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.348208 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0c494c18-0d46-4e23-8ef5-214938a66a7b-host-var-lib-cni-multus\") pod \"multus-85dv2\" (UID: \"0c494c18-0d46-4e23-8ef5-214938a66a7b\") " pod="openshift-multus/multus-85dv2" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.348227 4906 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrdxj\" (UniqueName: \"kubernetes.io/projected/7e902b2c-8bf9-49e8-9820-392f34dbfb10-kube-api-access-hrdxj\") pod \"multus-additional-cni-plugins-jgfpc\" (UID: \"7e902b2c-8bf9-49e8-9820-392f34dbfb10\") " pod="openshift-multus/multus-additional-cni-plugins-jgfpc" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.348249 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/72d61d35-0a64-45a5-8df3-9c429727deba-rootfs\") pod \"machine-config-daemon-bxtw4\" (UID: \"72d61d35-0a64-45a5-8df3-9c429727deba\") " pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.348288 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0c494c18-0d46-4e23-8ef5-214938a66a7b-host-var-lib-kubelet\") pod \"multus-85dv2\" (UID: \"0c494c18-0d46-4e23-8ef5-214938a66a7b\") " pod="openshift-multus/multus-85dv2" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.348358 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0c494c18-0d46-4e23-8ef5-214938a66a7b-host-var-lib-cni-bin\") pod \"multus-85dv2\" (UID: \"0c494c18-0d46-4e23-8ef5-214938a66a7b\") " pod="openshift-multus/multus-85dv2" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.348454 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0c494c18-0d46-4e23-8ef5-214938a66a7b-host-run-netns\") pod \"multus-85dv2\" (UID: \"0c494c18-0d46-4e23-8ef5-214938a66a7b\") " pod="openshift-multus/multus-85dv2" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.348482 
4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0c494c18-0d46-4e23-8ef5-214938a66a7b-hostroot\") pod \"multus-85dv2\" (UID: \"0c494c18-0d46-4e23-8ef5-214938a66a7b\") " pod="openshift-multus/multus-85dv2" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.348503 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0c494c18-0d46-4e23-8ef5-214938a66a7b-multus-daemon-config\") pod \"multus-85dv2\" (UID: \"0c494c18-0d46-4e23-8ef5-214938a66a7b\") " pod="openshift-multus/multus-85dv2" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.348542 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0c494c18-0d46-4e23-8ef5-214938a66a7b-cnibin\") pod \"multus-85dv2\" (UID: \"0c494c18-0d46-4e23-8ef5-214938a66a7b\") " pod="openshift-multus/multus-85dv2" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.348565 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0c494c18-0d46-4e23-8ef5-214938a66a7b-host-run-k8s-cni-cncf-io\") pod \"multus-85dv2\" (UID: \"0c494c18-0d46-4e23-8ef5-214938a66a7b\") " pod="openshift-multus/multus-85dv2" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.348587 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7e902b2c-8bf9-49e8-9820-392f34dbfb10-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jgfpc\" (UID: \"7e902b2c-8bf9-49e8-9820-392f34dbfb10\") " pod="openshift-multus/multus-additional-cni-plugins-jgfpc" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.348605 4906 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0c494c18-0d46-4e23-8ef5-214938a66a7b-system-cni-dir\") pod \"multus-85dv2\" (UID: \"0c494c18-0d46-4e23-8ef5-214938a66a7b\") " pod="openshift-multus/multus-85dv2" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.351786 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72d61d35-0a64-45a5-8df3-9c429727deba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bxtw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.366810 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8518e6b0-48c0-4076-a58a-e2ba8751c90f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1ce33b4800ac33b19f03506f197894abd2b965c34db7b506d7f06e8824b0229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1237a4e233c776796482d73c65c85e5d86e592d6a7f53a7b76fea4529fe9e435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e445f27670002f5ac446f9be56896acdbf9e3e30801b073b231cc8ce7ddac7c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75634c35b3e26fec38c8aa89f5a8055117f60566c8d46cf9ec28cad40c688be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f894
5c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab422d45102787731cdbfd6d0e400974761d5af01f0a43d04cc25228c1db30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:06:54.228491 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:06:54.228657 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:06:54.229423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1709734890/tls.crt::/tmp/serving-cert-1709734890/tls.key\\\\\\\"\\\\nI0310 00:06:54.538287 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:06:54.541181 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:06:54.541202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:06:54.541225 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:06:54.541230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:06:54.549119 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 00:06:54.549178 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549189 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549199 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:06:54.549207 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:06:54.549213 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:06:54.549219 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 00:06:54.549128 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 00:06:54.551727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c49063cf40b4074c120bf78de05481a9e5638b11137a47d73d15b3e1a43f41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.382054 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72569fcc50f473ea417eb7403f0901fc02d50a886ea39d62b2bb02682bc64dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ed04015b5fffd0db47db60c0aec762bd61d476c7cc3c0e726a5bd5de43baef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.393476 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.393533 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.393547 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.393569 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.393582 4906 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:00Z","lastTransitionTime":"2026-03-10T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.394582 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://161718aee1b97751dccdff3689a0dc26eb0efb1842899be5bb4994598adb36db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.407501 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-85dv2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c494c18-0d46-4e23-8ef5-214938a66a7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftt77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-85dv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.419354 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8681864-d447-44e6-9cf5-6408d49ec58a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4aeb4d2ad9fadbae974054da89166076b81a944e08f5475f1a9382a48cdc623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.433090 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c37b204d34ae2c304ca12b8978e189ce99a6c9352c9f68e01d074a8fccc1d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.443936 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bd57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a677000-e7dd-40fa-ad3e-94c061293e29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hdfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bd57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.449207 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0c494c18-0d46-4e23-8ef5-214938a66a7b-host-var-lib-kubelet\") pod \"multus-85dv2\" (UID: \"0c494c18-0d46-4e23-8ef5-214938a66a7b\") " pod="openshift-multus/multus-85dv2" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.449235 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/0c494c18-0d46-4e23-8ef5-214938a66a7b-host-run-netns\") pod \"multus-85dv2\" (UID: \"0c494c18-0d46-4e23-8ef5-214938a66a7b\") " pod="openshift-multus/multus-85dv2" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.449250 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0c494c18-0d46-4e23-8ef5-214938a66a7b-host-var-lib-cni-bin\") pod \"multus-85dv2\" (UID: \"0c494c18-0d46-4e23-8ef5-214938a66a7b\") " pod="openshift-multus/multus-85dv2" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.449267 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0c494c18-0d46-4e23-8ef5-214938a66a7b-cnibin\") pod \"multus-85dv2\" (UID: \"0c494c18-0d46-4e23-8ef5-214938a66a7b\") " pod="openshift-multus/multus-85dv2" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.449284 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0c494c18-0d46-4e23-8ef5-214938a66a7b-host-run-k8s-cni-cncf-io\") pod \"multus-85dv2\" (UID: \"0c494c18-0d46-4e23-8ef5-214938a66a7b\") " pod="openshift-multus/multus-85dv2" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.449297 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0c494c18-0d46-4e23-8ef5-214938a66a7b-hostroot\") pod \"multus-85dv2\" (UID: \"0c494c18-0d46-4e23-8ef5-214938a66a7b\") " pod="openshift-multus/multus-85dv2" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.449310 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0c494c18-0d46-4e23-8ef5-214938a66a7b-multus-daemon-config\") pod \"multus-85dv2\" (UID: \"0c494c18-0d46-4e23-8ef5-214938a66a7b\") " 
pod="openshift-multus/multus-85dv2" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.449325 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7e902b2c-8bf9-49e8-9820-392f34dbfb10-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jgfpc\" (UID: \"7e902b2c-8bf9-49e8-9820-392f34dbfb10\") " pod="openshift-multus/multus-additional-cni-plugins-jgfpc" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.449340 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0c494c18-0d46-4e23-8ef5-214938a66a7b-system-cni-dir\") pod \"multus-85dv2\" (UID: \"0c494c18-0d46-4e23-8ef5-214938a66a7b\") " pod="openshift-multus/multus-85dv2" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.449354 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7e902b2c-8bf9-49e8-9820-392f34dbfb10-os-release\") pod \"multus-additional-cni-plugins-jgfpc\" (UID: \"7e902b2c-8bf9-49e8-9820-392f34dbfb10\") " pod="openshift-multus/multus-additional-cni-plugins-jgfpc" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.449368 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0c494c18-0d46-4e23-8ef5-214938a66a7b-os-release\") pod \"multus-85dv2\" (UID: \"0c494c18-0d46-4e23-8ef5-214938a66a7b\") " pod="openshift-multus/multus-85dv2" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.449385 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7e902b2c-8bf9-49e8-9820-392f34dbfb10-system-cni-dir\") pod \"multus-additional-cni-plugins-jgfpc\" (UID: \"7e902b2c-8bf9-49e8-9820-392f34dbfb10\") " pod="openshift-multus/multus-additional-cni-plugins-jgfpc" Mar 10 
00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.449402 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0c494c18-0d46-4e23-8ef5-214938a66a7b-cni-binary-copy\") pod \"multus-85dv2\" (UID: \"0c494c18-0d46-4e23-8ef5-214938a66a7b\") " pod="openshift-multus/multus-85dv2" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.449431 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0c494c18-0d46-4e23-8ef5-214938a66a7b-multus-cni-dir\") pod \"multus-85dv2\" (UID: \"0c494c18-0d46-4e23-8ef5-214938a66a7b\") " pod="openshift-multus/multus-85dv2" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.449447 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7e902b2c-8bf9-49e8-9820-392f34dbfb10-cnibin\") pod \"multus-additional-cni-plugins-jgfpc\" (UID: \"7e902b2c-8bf9-49e8-9820-392f34dbfb10\") " pod="openshift-multus/multus-additional-cni-plugins-jgfpc" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.449463 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7e902b2c-8bf9-49e8-9820-392f34dbfb10-cni-binary-copy\") pod \"multus-additional-cni-plugins-jgfpc\" (UID: \"7e902b2c-8bf9-49e8-9820-392f34dbfb10\") " pod="openshift-multus/multus-additional-cni-plugins-jgfpc" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.449478 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/72d61d35-0a64-45a5-8df3-9c429727deba-mcd-auth-proxy-config\") pod \"machine-config-daemon-bxtw4\" (UID: \"72d61d35-0a64-45a5-8df3-9c429727deba\") " pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" Mar 10 00:08:00 crc kubenswrapper[4906]: 
I0310 00:08:00.449496 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0c494c18-0d46-4e23-8ef5-214938a66a7b-multus-socket-dir-parent\") pod \"multus-85dv2\" (UID: \"0c494c18-0d46-4e23-8ef5-214938a66a7b\") " pod="openshift-multus/multus-85dv2" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.449512 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0c494c18-0d46-4e23-8ef5-214938a66a7b-multus-conf-dir\") pod \"multus-85dv2\" (UID: \"0c494c18-0d46-4e23-8ef5-214938a66a7b\") " pod="openshift-multus/multus-85dv2" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.449528 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0c494c18-0d46-4e23-8ef5-214938a66a7b-host-run-multus-certs\") pod \"multus-85dv2\" (UID: \"0c494c18-0d46-4e23-8ef5-214938a66a7b\") " pod="openshift-multus/multus-85dv2" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.449546 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftt77\" (UniqueName: \"kubernetes.io/projected/0c494c18-0d46-4e23-8ef5-214938a66a7b-kube-api-access-ftt77\") pod \"multus-85dv2\" (UID: \"0c494c18-0d46-4e23-8ef5-214938a66a7b\") " pod="openshift-multus/multus-85dv2" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.449580 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7e902b2c-8bf9-49e8-9820-392f34dbfb10-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jgfpc\" (UID: \"7e902b2c-8bf9-49e8-9820-392f34dbfb10\") " pod="openshift-multus/multus-additional-cni-plugins-jgfpc" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.449596 4906 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5ccf9\" (UniqueName: \"kubernetes.io/projected/72d61d35-0a64-45a5-8df3-9c429727deba-kube-api-access-5ccf9\") pod \"machine-config-daemon-bxtw4\" (UID: \"72d61d35-0a64-45a5-8df3-9c429727deba\") " pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.449612 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0c494c18-0d46-4e23-8ef5-214938a66a7b-etc-kubernetes\") pod \"multus-85dv2\" (UID: \"0c494c18-0d46-4e23-8ef5-214938a66a7b\") " pod="openshift-multus/multus-85dv2" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.449626 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0c494c18-0d46-4e23-8ef5-214938a66a7b-host-var-lib-cni-multus\") pod \"multus-85dv2\" (UID: \"0c494c18-0d46-4e23-8ef5-214938a66a7b\") " pod="openshift-multus/multus-85dv2" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.449666 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/72d61d35-0a64-45a5-8df3-9c429727deba-proxy-tls\") pod \"machine-config-daemon-bxtw4\" (UID: \"72d61d35-0a64-45a5-8df3-9c429727deba\") " pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.449692 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrdxj\" (UniqueName: \"kubernetes.io/projected/7e902b2c-8bf9-49e8-9820-392f34dbfb10-kube-api-access-hrdxj\") pod \"multus-additional-cni-plugins-jgfpc\" (UID: \"7e902b2c-8bf9-49e8-9820-392f34dbfb10\") " pod="openshift-multus/multus-additional-cni-plugins-jgfpc" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.449707 4906 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/72d61d35-0a64-45a5-8df3-9c429727deba-rootfs\") pod \"machine-config-daemon-bxtw4\" (UID: \"72d61d35-0a64-45a5-8df3-9c429727deba\") " pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.449808 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/72d61d35-0a64-45a5-8df3-9c429727deba-rootfs\") pod \"machine-config-daemon-bxtw4\" (UID: \"72d61d35-0a64-45a5-8df3-9c429727deba\") " pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.449985 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0c494c18-0d46-4e23-8ef5-214938a66a7b-hostroot\") pod \"multus-85dv2\" (UID: \"0c494c18-0d46-4e23-8ef5-214938a66a7b\") " pod="openshift-multus/multus-85dv2" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.450004 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7e902b2c-8bf9-49e8-9820-392f34dbfb10-cnibin\") pod \"multus-additional-cni-plugins-jgfpc\" (UID: \"7e902b2c-8bf9-49e8-9820-392f34dbfb10\") " pod="openshift-multus/multus-additional-cni-plugins-jgfpc" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.450154 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0c494c18-0d46-4e23-8ef5-214938a66a7b-multus-cni-dir\") pod \"multus-85dv2\" (UID: \"0c494c18-0d46-4e23-8ef5-214938a66a7b\") " pod="openshift-multus/multus-85dv2" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.450575 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/0c494c18-0d46-4e23-8ef5-214938a66a7b-multus-daemon-config\") pod \"multus-85dv2\" (UID: \"0c494c18-0d46-4e23-8ef5-214938a66a7b\") " pod="openshift-multus/multus-85dv2" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.450598 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7e902b2c-8bf9-49e8-9820-392f34dbfb10-cni-binary-copy\") pod \"multus-additional-cni-plugins-jgfpc\" (UID: \"7e902b2c-8bf9-49e8-9820-392f34dbfb10\") " pod="openshift-multus/multus-additional-cni-plugins-jgfpc" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.453561 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0c494c18-0d46-4e23-8ef5-214938a66a7b-host-run-netns\") pod \"multus-85dv2\" (UID: \"0c494c18-0d46-4e23-8ef5-214938a66a7b\") " pod="openshift-multus/multus-85dv2" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.453792 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0c494c18-0d46-4e23-8ef5-214938a66a7b-etc-kubernetes\") pod \"multus-85dv2\" (UID: \"0c494c18-0d46-4e23-8ef5-214938a66a7b\") " pod="openshift-multus/multus-85dv2" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.453913 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0c494c18-0d46-4e23-8ef5-214938a66a7b-multus-conf-dir\") pod \"multus-85dv2\" (UID: \"0c494c18-0d46-4e23-8ef5-214938a66a7b\") " pod="openshift-multus/multus-85dv2" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.454821 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0c494c18-0d46-4e23-8ef5-214938a66a7b-cnibin\") pod \"multus-85dv2\" (UID: \"0c494c18-0d46-4e23-8ef5-214938a66a7b\") " 
pod="openshift-multus/multus-85dv2" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.454871 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0c494c18-0d46-4e23-8ef5-214938a66a7b-host-run-k8s-cni-cncf-io\") pod \"multus-85dv2\" (UID: \"0c494c18-0d46-4e23-8ef5-214938a66a7b\") " pod="openshift-multus/multus-85dv2" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.454899 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0c494c18-0d46-4e23-8ef5-214938a66a7b-system-cni-dir\") pod \"multus-85dv2\" (UID: \"0c494c18-0d46-4e23-8ef5-214938a66a7b\") " pod="openshift-multus/multus-85dv2" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.454925 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0c494c18-0d46-4e23-8ef5-214938a66a7b-host-var-lib-cni-multus\") pod \"multus-85dv2\" (UID: \"0c494c18-0d46-4e23-8ef5-214938a66a7b\") " pod="openshift-multus/multus-85dv2" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.454953 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7e902b2c-8bf9-49e8-9820-392f34dbfb10-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jgfpc\" (UID: \"7e902b2c-8bf9-49e8-9820-392f34dbfb10\") " pod="openshift-multus/multus-additional-cni-plugins-jgfpc" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.454977 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7e902b2c-8bf9-49e8-9820-392f34dbfb10-os-release\") pod \"multus-additional-cni-plugins-jgfpc\" (UID: \"7e902b2c-8bf9-49e8-9820-392f34dbfb10\") " pod="openshift-multus/multus-additional-cni-plugins-jgfpc" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 
00:08:00.455737 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0c494c18-0d46-4e23-8ef5-214938a66a7b-host-var-lib-cni-bin\") pod \"multus-85dv2\" (UID: \"0c494c18-0d46-4e23-8ef5-214938a66a7b\") " pod="openshift-multus/multus-85dv2" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.455790 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0c494c18-0d46-4e23-8ef5-214938a66a7b-multus-socket-dir-parent\") pod \"multus-85dv2\" (UID: \"0c494c18-0d46-4e23-8ef5-214938a66a7b\") " pod="openshift-multus/multus-85dv2" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.455922 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7e902b2c-8bf9-49e8-9820-392f34dbfb10-system-cni-dir\") pod \"multus-additional-cni-plugins-jgfpc\" (UID: \"7e902b2c-8bf9-49e8-9820-392f34dbfb10\") " pod="openshift-multus/multus-additional-cni-plugins-jgfpc" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.455988 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0c494c18-0d46-4e23-8ef5-214938a66a7b-os-release\") pod \"multus-85dv2\" (UID: \"0c494c18-0d46-4e23-8ef5-214938a66a7b\") " pod="openshift-multus/multus-85dv2" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.456019 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0c494c18-0d46-4e23-8ef5-214938a66a7b-host-run-multus-certs\") pod \"multus-85dv2\" (UID: \"0c494c18-0d46-4e23-8ef5-214938a66a7b\") " pod="openshift-multus/multus-85dv2" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.456077 4906 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.456224 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/72d61d35-0a64-45a5-8df3-9c429727deba-mcd-auth-proxy-config\") pod \"machine-config-daemon-bxtw4\" (UID: \"72d61d35-0a64-45a5-8df3-9c429727deba\") " pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.456293 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0c494c18-0d46-4e23-8ef5-214938a66a7b-cni-binary-copy\") pod \"multus-85dv2\" (UID: \"0c494c18-0d46-4e23-8ef5-214938a66a7b\") " pod="openshift-multus/multus-85dv2" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.456426 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0c494c18-0d46-4e23-8ef5-214938a66a7b-host-var-lib-kubelet\") pod \"multus-85dv2\" 
(UID: \"0c494c18-0d46-4e23-8ef5-214938a66a7b\") " pod="openshift-multus/multus-85dv2" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.459520 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7e902b2c-8bf9-49e8-9820-392f34dbfb10-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jgfpc\" (UID: \"7e902b2c-8bf9-49e8-9820-392f34dbfb10\") " pod="openshift-multus/multus-additional-cni-plugins-jgfpc" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.462946 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/72d61d35-0a64-45a5-8df3-9c429727deba-proxy-tls\") pod \"machine-config-daemon-bxtw4\" (UID: \"72d61d35-0a64-45a5-8df3-9c429727deba\") " pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.469152 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrdxj\" (UniqueName: \"kubernetes.io/projected/7e902b2c-8bf9-49e8-9820-392f34dbfb10-kube-api-access-hrdxj\") pod \"multus-additional-cni-plugins-jgfpc\" (UID: \"7e902b2c-8bf9-49e8-9820-392f34dbfb10\") " pod="openshift-multus/multus-additional-cni-plugins-jgfpc" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.471418 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.474403 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ccf9\" (UniqueName: \"kubernetes.io/projected/72d61d35-0a64-45a5-8df3-9c429727deba-kube-api-access-5ccf9\") pod \"machine-config-daemon-bxtw4\" (UID: \"72d61d35-0a64-45a5-8df3-9c429727deba\") " pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.475081 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftt77\" (UniqueName: \"kubernetes.io/projected/0c494c18-0d46-4e23-8ef5-214938a66a7b-kube-api-access-ftt77\") pod \"multus-85dv2\" (UID: \"0c494c18-0d46-4e23-8ef5-214938a66a7b\") " pod="openshift-multus/multus-85dv2" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.485136 4906 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-jgfpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e902b2c-8bf9-49e8-9820-392f34dbfb10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jgfpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.490300 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-jgfpc" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.496244 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.496273 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.496283 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.496298 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.496307 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:00Z","lastTransitionTime":"2026-03-10T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.501328 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" Mar 10 00:08:00 crc kubenswrapper[4906]: W0310 00:08:00.502816 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e902b2c_8bf9_49e8_9820_392f34dbfb10.slice/crio-49321f7413911c01f12f8d5cd3ec03ebeb07b6abdafdf8555ccc41ac94499b41 WatchSource:0}: Error finding container 49321f7413911c01f12f8d5cd3ec03ebeb07b6abdafdf8555ccc41ac94499b41: Status 404 returned error can't find the container with id 49321f7413911c01f12f8d5cd3ec03ebeb07b6abdafdf8555ccc41ac94499b41 Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.505561 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-85dv2" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.509037 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1c5481b-1561-42af-9126-977132b7ced2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8b43acbd8d86bc6523a3304162f30fcf4068197c4a56018dc56e1c8ae0bcd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3effb7d619bec3fd1ef64a418efe1e3503cb2dd187ce2188911a90b4034d331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad3cc19888cbd84c556e52eda6a3be770768b7316982d6442717f6aee62191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd888b723dd7b84ded4f27c69d82e2b203fec96dbaab8198df43bf0806b34ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e5dc928d14830f187866775b2cd95c46cff2d7749025b36d16f6489662bafa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4906]: W0310 00:08:00.515108 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72d61d35_0a64_45a5_8df3_9c429727deba.slice/crio-a005430960ee78384fa772e13c10a717d34796b325eb4f3171f85e293b80995a WatchSource:0}: Error finding container a005430960ee78384fa772e13c10a717d34796b325eb4f3171f85e293b80995a: Status 404 returned error can't find the container with id a005430960ee78384fa772e13c10a717d34796b325eb4f3171f85e293b80995a Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.521027 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4906]: W0310 00:08:00.527668 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c494c18_0d46_4e23_8ef5_214938a66a7b.slice/crio-5e4088d1635654c4160e84421c1622b04e54bf9746b1476f632af3a920d5ab67 WatchSource:0}: Error finding container 
5e4088d1635654c4160e84421c1622b04e54bf9746b1476f632af3a920d5ab67: Status 404 returned error can't find the container with id 5e4088d1635654c4160e84421c1622b04e54bf9746b1476f632af3a920d5ab67 Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.536569 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-hskrb"] Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.537682 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.540221 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.540333 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.541659 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.542083 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.542122 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.542138 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.542352 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.563351 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1c5481b-1561-42af-9126-977132b7ced2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8b43acbd8d86bc6523a3304162f30fcf4068197c4a56018dc56e1c8ae0bcd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3effb7d619bec3fd1ef64a418efe1e3503cb2dd187ce2188911a90b4034d331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad3cc19888cbd84c556e52eda6a3be770768b7316982d6442717f6aee62191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd888b723dd7b84ded4f27c69d82e2b203fec96dbaab8198df43bf0806b34ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e5dc928d14830f187866775b2cd95c46cff2d7749025b36d16f6489662bafa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"exitCode\\\":0,\\\"fini
shedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"na
me\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.576630 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.576682 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.576672 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4906]: E0310 00:08:00.576795 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.576861 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:00 crc kubenswrapper[4906]: E0310 00:08:00.576973 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:00 crc kubenswrapper[4906]: E0310 00:08:00.577045 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.589446 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.598510 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.598580 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.598595 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.598614 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.598911 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:00Z","lastTransitionTime":"2026-03-10T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.603081 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.616998 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jgfpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e902b2c-8bf9-49e8-9820-392f34dbfb10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\
\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jgfpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.630134 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72569fcc50f473ea417eb7403f0901fc02d50a886ea39d62b2bb02682bc64dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ed04015b5fffd0db47db60c0aec762bd61d476c7cc3c0e726a5bd5de43baef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.641907 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72d61d35-0a64-45a5-8df3-9c429727deba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bxtw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.651271 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-host-kubelet\") pod \"ovnkube-node-hskrb\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.651326 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-run-systemd\") pod \"ovnkube-node-hskrb\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.651356 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-host-run-ovn-kubernetes\") pod \"ovnkube-node-hskrb\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.651382 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-log-socket\") pod \"ovnkube-node-hskrb\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.651404 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-host-cni-bin\") pod \"ovnkube-node-hskrb\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.651426 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-ovnkube-config\") pod \"ovnkube-node-hskrb\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.651573 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbtkp\" (UniqueName: \"kubernetes.io/projected/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-kube-api-access-bbtkp\") pod \"ovnkube-node-hskrb\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.651690 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-var-lib-openvswitch\") pod \"ovnkube-node-hskrb\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.651716 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-run-ovn\") pod \"ovnkube-node-hskrb\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.651746 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-ovn-node-metrics-cert\") pod \"ovnkube-node-hskrb\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.651772 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-ovnkube-script-lib\") pod \"ovnkube-node-hskrb\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.651811 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hskrb\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.651843 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-etc-openvswitch\") pod \"ovnkube-node-hskrb\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.651903 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-host-run-netns\") pod \"ovnkube-node-hskrb\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.651935 4906 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-host-cni-netd\") pod \"ovnkube-node-hskrb\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.651967 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-env-overrides\") pod \"ovnkube-node-hskrb\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.652060 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-systemd-units\") pod \"ovnkube-node-hskrb\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.652113 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-host-slash\") pod \"ovnkube-node-hskrb\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.652149 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-run-openvswitch\") pod \"ovnkube-node-hskrb\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.652185 
4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-node-log\") pod \"ovnkube-node-hskrb\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.659169 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8518e6b0-48c0-4076-a58a-e2ba8751c90f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1ce33b4800ac33b19f03506f197894abd2b965c34db7b506d7f06e8824b0229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\
"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1237a4e233c776796482d73c65c85e5d86e592d6a7f53a7b76fea4529fe9e435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e445f27670002f5ac446f9be56896acdbf9e3e30801b073b231cc8ce7ddac7c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75634c35b3e26fec38c8aa89f5a8055117f60566c8d46cf9ec28cad40
c688be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab422d45102787731cdbfd6d0e400974761d5af01f0a43d04cc25228c1db30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:06:54.228491 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:06:54.228657 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:06:54.229423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1709734890/tls.crt::/tmp/serving-cert-1709734890/tls.key\\\\\\\"\\\\nI0310 00:06:54.538287 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:06:54.541181 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:06:54.541202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:06:54.541225 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:06:54.541230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:06:54.549119 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 00:06:54.549178 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549189 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 
00:06:54.549199 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:06:54.549207 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:06:54.549213 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:06:54.549219 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 00:06:54.549128 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 00:06:54.551727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c49063cf40b4074c120bf78de05481a9e5638b11137a47d73d15b3e1a43f41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666
dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.671990 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-85dv2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c494c18-0d46-4e23-8ef5-214938a66a7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftt77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-85dv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.685100 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://161718aee1b97751dccdff3689a0dc26eb0efb1842899be5bb4994598adb36db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2b
b72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.698833 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c37b204d34ae2c304ca12b8978e189ce99a6c9352c9f68e01d074a8fccc1d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.706386 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.706423 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.706436 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.706487 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.706502 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:00Z","lastTransitionTime":"2026-03-10T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.708544 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bd57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a677000-e7dd-40fa-ad3e-94c061293e29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hdfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bd57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.734735 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hskrb\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.746427 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8681864-d447-44e6-9cf5-6408d49ec58a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4aeb4d2ad9fadbae974054da89166076b81a944e08f5475f1a9382a48cdc623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.753715 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-log-socket\") pod \"ovnkube-node-hskrb\" (UID: 
\"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.753771 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-host-cni-bin\") pod \"ovnkube-node-hskrb\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.753797 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-ovnkube-config\") pod \"ovnkube-node-hskrb\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.753827 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbtkp\" (UniqueName: \"kubernetes.io/projected/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-kube-api-access-bbtkp\") pod \"ovnkube-node-hskrb\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.753852 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-var-lib-openvswitch\") pod \"ovnkube-node-hskrb\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.753872 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-run-ovn\") pod \"ovnkube-node-hskrb\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.753907 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-ovn-node-metrics-cert\") pod \"ovnkube-node-hskrb\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.753932 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-ovnkube-script-lib\") pod \"ovnkube-node-hskrb\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.753966 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hskrb\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.753992 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-etc-openvswitch\") pod \"ovnkube-node-hskrb\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.754013 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-host-run-netns\") pod \"ovnkube-node-hskrb\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.754034 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-host-cni-netd\") pod \"ovnkube-node-hskrb\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.754058 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-env-overrides\") pod \"ovnkube-node-hskrb\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.754082 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-systemd-units\") pod \"ovnkube-node-hskrb\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.754103 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-host-slash\") pod \"ovnkube-node-hskrb\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.754125 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-run-openvswitch\") pod \"ovnkube-node-hskrb\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: 
I0310 00:08:00.754152 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-node-log\") pod \"ovnkube-node-hskrb\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.754198 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-host-kubelet\") pod \"ovnkube-node-hskrb\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.754223 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-run-systemd\") pod \"ovnkube-node-hskrb\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.754294 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-host-run-ovn-kubernetes\") pod \"ovnkube-node-hskrb\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.754390 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-host-run-ovn-kubernetes\") pod \"ovnkube-node-hskrb\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.754447 4906 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-log-socket\") pod \"ovnkube-node-hskrb\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.754481 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-host-cni-bin\") pod \"ovnkube-node-hskrb\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.754651 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-host-run-netns\") pod \"ovnkube-node-hskrb\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.755158 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hskrb\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.755288 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-host-slash\") pod \"ovnkube-node-hskrb\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.755318 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-ovnkube-script-lib\") pod \"ovnkube-node-hskrb\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.755381 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-host-cni-netd\") pod \"ovnkube-node-hskrb\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.755410 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-etc-openvswitch\") pod \"ovnkube-node-hskrb\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.755429 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-systemd-units\") pod \"ovnkube-node-hskrb\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.755476 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-node-log\") pod \"ovnkube-node-hskrb\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.755513 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-run-openvswitch\") pod \"ovnkube-node-hskrb\" (UID: 
\"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.755571 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-host-kubelet\") pod \"ovnkube-node-hskrb\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.755666 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-run-ovn\") pod \"ovnkube-node-hskrb\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.755705 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-env-overrides\") pod \"ovnkube-node-hskrb\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.755749 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-run-systemd\") pod \"ovnkube-node-hskrb\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.756071 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-var-lib-openvswitch\") pod \"ovnkube-node-hskrb\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: 
I0310 00:08:00.756275 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-ovnkube-config\") pod \"ovnkube-node-hskrb\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.760321 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-ovn-node-metrics-cert\") pod \"ovnkube-node-hskrb\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.773006 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbtkp\" (UniqueName: \"kubernetes.io/projected/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-kube-api-access-bbtkp\") pod \"ovnkube-node-hskrb\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.809201 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.809494 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.809559 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.809625 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.809719 4906 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:00Z","lastTransitionTime":"2026-03-10T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.858674 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:00 crc kubenswrapper[4906]: W0310 00:08:00.869602 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9f87520_6105_4b6f_ba5a_a232b5dc24c0.slice/crio-558cd2075ee300f87f012fb60f7aafe0af05a9b68f56a72922916b36cd40d060 WatchSource:0}: Error finding container 558cd2075ee300f87f012fb60f7aafe0af05a9b68f56a72922916b36cd40d060: Status 404 returned error can't find the container with id 558cd2075ee300f87f012fb60f7aafe0af05a9b68f56a72922916b36cd40d060 Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.912259 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.912308 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.912320 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.912343 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:00 crc kubenswrapper[4906]: I0310 00:08:00.912356 4906 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:00Z","lastTransitionTime":"2026-03-10T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.014690 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.014731 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.014740 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.014755 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.014765 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:01Z","lastTransitionTime":"2026-03-10T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.017072 4906 generic.go:334] "Generic (PLEG): container finished" podID="7e902b2c-8bf9-49e8-9820-392f34dbfb10" containerID="461ade02a7f0df808708774e4e7ac0d3e416bbc56988253f3f7741b2403f411e" exitCode=0 Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.017139 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jgfpc" event={"ID":"7e902b2c-8bf9-49e8-9820-392f34dbfb10","Type":"ContainerDied","Data":"461ade02a7f0df808708774e4e7ac0d3e416bbc56988253f3f7741b2403f411e"} Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.017173 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jgfpc" event={"ID":"7e902b2c-8bf9-49e8-9820-392f34dbfb10","Type":"ContainerStarted","Data":"49321f7413911c01f12f8d5cd3ec03ebeb07b6abdafdf8555ccc41ac94499b41"} Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.019328 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9bd57" event={"ID":"9a677000-e7dd-40fa-ad3e-94c061293e29","Type":"ContainerStarted","Data":"4dedc89ecfde3aab4fefadbd750541c4fe076777f8e0583ab87c3a33ad2d3b30"} Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.019387 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9bd57" event={"ID":"9a677000-e7dd-40fa-ad3e-94c061293e29","Type":"ContainerStarted","Data":"d3d9dfd9d37061f70c1a1aa2773f3b3b7b00468d6032bc87de5ec7c5b98acc96"} Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.021304 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-85dv2" event={"ID":"0c494c18-0d46-4e23-8ef5-214938a66a7b","Type":"ContainerStarted","Data":"7c5df9fb921fc76518f8da175e0a9b4e05f248b292a22ee6bcb600e07d94077a"} Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.021329 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-85dv2" event={"ID":"0c494c18-0d46-4e23-8ef5-214938a66a7b","Type":"ContainerStarted","Data":"5e4088d1635654c4160e84421c1622b04e54bf9746b1476f632af3a920d5ab67"} Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.025491 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" event={"ID":"72d61d35-0a64-45a5-8df3-9c429727deba","Type":"ContainerStarted","Data":"da01514bb52ced358e921f135d8e59cb221cf1c2c633fb9ee2b9e3d90ef0ffce"} Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.025554 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" event={"ID":"72d61d35-0a64-45a5-8df3-9c429727deba","Type":"ContainerStarted","Data":"f53c021f85bbd1f77bb00169203a7ee8629e31f3fab47b40dc594d7995d4b82a"} Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.025567 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" event={"ID":"72d61d35-0a64-45a5-8df3-9c429727deba","Type":"ContainerStarted","Data":"a005430960ee78384fa772e13c10a717d34796b325eb4f3171f85e293b80995a"} Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.030030 4906 generic.go:334] "Generic (PLEG): container finished" podID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" containerID="52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80" exitCode=0 Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.030067 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" event={"ID":"c9f87520-6105-4b6f-ba5a-a232b5dc24c0","Type":"ContainerDied","Data":"52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80"} Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.030098 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" 
event={"ID":"c9f87520-6105-4b6f-ba5a-a232b5dc24c0","Type":"ContainerStarted","Data":"558cd2075ee300f87f012fb60f7aafe0af05a9b68f56a72922916b36cd40d060"} Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.041829 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1c5481b-1561-42af-9126-977132b7ced2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8b43acbd8d86bc6523a3304162f30fcf4068197c4a56018dc56e1c8ae0bcd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-
dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3effb7d619bec3fd1ef64a418efe1e3503cb2dd187ce2188911a90b4034d331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad3cc19888cbd84c556e52eda6a3be770768b7316982d6442717f6aee62191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd888b723dd7b84ded4f27c69d82e2b203fec96dbaab8198df43bf0806b34ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723
269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e5dc928d14830f187866775b2cd95c46cff2d7749025b36d16f6489662bafa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6
a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"exit
Code\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.056909 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.071626 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.086426 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.105215 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jgfpc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e902b2c-8bf9-49e8-9820-392f34dbfb10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://461ade02a7f0df808708774e4e7ac0d3e416bbc56988253f3f7741b2403f411e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://461ade02a7f0df808708774e4e7ac0d3e416bbc56988253f3f7741b2403f411e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jgfpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.117385 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.117418 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.117427 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.117443 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.117452 4906 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:01Z","lastTransitionTime":"2026-03-10T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.119021 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8518e6b0-48c0-4076-a58a-e2ba8751c90f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1ce33b4800ac33b19f03506f197894abd2b965c34db7b506d7f06e8824b0229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1237a4e233c776796482d73c65c85e5d86e592d6a7f53a7b76fea4529fe9e435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e445f27670002f5ac446f9be56896acdbf9e3e30801b073b231cc8ce7ddac7c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e
75634c35b3e26fec38c8aa89f5a8055117f60566c8d46cf9ec28cad40c688be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab422d45102787731cdbfd6d0e400974761d5af01f0a43d04cc25228c1db30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:06:54.228491 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:06:54.228657 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:06:54.229423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1709734890/tls.crt::/tmp/serving-cert-1709734890/tls.key\\\\\\\"\\\\nI0310 00:06:54.538287 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:06:54.541181 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:06:54.541202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:06:54.541225 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:06:54.541230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:06:54.549119 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 00:06:54.549178 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549189 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549199 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:06:54.549207 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:06:54.549213 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:06:54.549219 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 00:06:54.549128 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 00:06:54.551727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c49063cf40b4074c120bf78de05481a9e5638b11137a47d73d15b3e1a43f41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.134487 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72569fcc50f473ea417eb7403f0901fc02d50a886ea39d62b2bb02682bc64dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ed04015b5fffd0db47db60c0aec762bd61d476c7cc3c0e726a5bd5de43baef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.147671 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72d61d35-0a64-45a5-8df3-9c429727deba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bxtw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.160569 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://161718aee1b97751dccdff3689a0dc26eb0efb1842899be5bb4994598adb36db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.171740 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-85dv2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c494c18-0d46-4e23-8ef5-214938a66a7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftt77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-85dv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.184832 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8681864-d447-44e6-9cf5-6408d49ec58a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4aeb4d2ad9fadbae974054da89166076b81a944e08f5475f1a9382a48cdc623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.200117 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c37b204d34ae2c304ca12b8978e189ce99a6c9352c9f68e01d074a8fccc1d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.212867 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bd57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a677000-e7dd-40fa-ad3e-94c061293e29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hdfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bd57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.220219 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.220252 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.220261 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.220278 4906 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.220288 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:01Z","lastTransitionTime":"2026-03-10T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.240026 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hskrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.254994 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8681864-d447-44e6-9cf5-6408d49ec58a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4aeb4d2ad9fadbae974054da89166076b81a944e08f5475f1a9382a48cdc623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.268806 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c37b204d34ae2c304ca12b8978e189ce99a6c9352c9f68e01d074a8fccc1d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\
" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.281828 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bd57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a677000-e7dd-40fa-ad3e-94c061293e29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dedc89ecfde3aab4fefadbd750541c4fe076777f8e0583ab87c3a33ad2d3b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hdfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bd57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.303466 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hskrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.317493 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.322542 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.322583 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.322599 4906 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.322617 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.322631 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:01Z","lastTransitionTime":"2026-03-10T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.335716 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jgfpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e902b2c-8bf9-49e8-9820-392f34dbfb10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://461ade02a7f0df808708774e4e7ac0d3e416bbc56988253f3f7741b2403f411e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://461ade02a7f0df808708774e4e7ac0d3e416bbc56988253f3f7741b2403f411e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/b
in\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jgfpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.365853 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1c5481b-1561-42af-9126-977132b7ced2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8b43acbd8d86bc6523a3304162f30fcf4068197c4a56018dc56e1c8ae0bcd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3effb7d619bec3fd1ef64a418efe1e3503cb2dd187ce2188911a90b4034d331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad3cc19888cbd84c556e52eda6a3be770768b7316982d6442717f6aee62191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd888b723dd7b84ded4f27c69d82e2b203fec96dbaab8198df43bf0806b34ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e5dc928d14830f187866775b2cd95c46cff2d7749025b36d16f6489662bafa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.379442 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.392836 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.405930 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8518e6b0-48c0-4076-a58a-e2ba8751c90f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1ce33b4800ac33b19f03506f197894abd2b965c34db7b506d7f06e8824b0229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1237a4e233c776796482d73c65c85e5d86e592d6a7f53a7b76fea4529fe9e435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e445f27670002f5ac446f9be56896acdbf9e3e30801b073b231cc8ce7ddac7c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75634c35b3e26fec38c8aa89f5a8055117f60566c8d46cf9ec28cad40c688be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab422d45102787731cdbfd6d0e400974761d5af01f0a43d04cc25228c1db30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:54Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 00:06:54.228491 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:06:54.228657 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:06:54.229423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1709734890/tls.crt::/tmp/serving-cert-1709734890/tls.key\\\\\\\"\\\\nI0310 00:06:54.538287 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:06:54.541181 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:06:54.541202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:06:54.541225 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:06:54.541230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:06:54.549119 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 00:06:54.549178 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549189 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549199 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:06:54.549207 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:06:54.549213 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:06:54.549219 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 00:06:54.549128 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0310 00:06:54.551727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c49063cf40b4074c120bf78de05481a9e5638b11137a47d73d15b3e1a43f41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca17
0f5955a2014a2d1e693bfb82373f964\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.418862 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72569fcc50f473ea417eb7403f0901fc02d50a886ea39d62b2bb02682bc64dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ed04015b5fffd0db47db60c0aec762bd61d476c7cc3c0e726a5bd5de43baef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.425040 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.425076 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.425085 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.425105 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.425115 4906 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:01Z","lastTransitionTime":"2026-03-10T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.430512 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72d61d35-0a64-45a5-8df3-9c429727deba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da01514bb52ced358e921f135d8e59cb221cf1c2c633fb9ee2b9e3d90ef0ffce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f53c021f85bbd1f77bb00169203a7ee8629e31f3fab47b40dc594d7995d4b82a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bxtw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.448452 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://161718aee1b97751dccdff3689a0dc26eb0efb1842899be5bb4994598adb36db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.465123 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-85dv2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c494c18-0d46-4e23-8ef5-214938a66a7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c5df9fb921fc76518f8da175e0a9b4e05f248b292a22ee6bcb600e07d94077a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftt77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"
}}\" for pod \"openshift-multus\"/\"multus-85dv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.527380 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.527417 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.527428 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.527446 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.527457 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:01Z","lastTransitionTime":"2026-03-10T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.630881 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.630926 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.630943 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.630961 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.630971 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:01Z","lastTransitionTime":"2026-03-10T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.734993 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.735426 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.735440 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.735458 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.735470 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:01Z","lastTransitionTime":"2026-03-10T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.837845 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.837875 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.837883 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.837898 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.837909 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:01Z","lastTransitionTime":"2026-03-10T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.940859 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.940893 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.940901 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.940917 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:01 crc kubenswrapper[4906]: I0310 00:08:01.940927 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:01Z","lastTransitionTime":"2026-03-10T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.038383 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" event={"ID":"c9f87520-6105-4b6f-ba5a-a232b5dc24c0","Type":"ContainerStarted","Data":"fee1a9d86245fa50b0fdbc1fd897326de61c74df8a05667d2dfca5a397a105c3"} Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.038466 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" event={"ID":"c9f87520-6105-4b6f-ba5a-a232b5dc24c0","Type":"ContainerStarted","Data":"911516785da6f740d1b7e21cd68006ed12e6e826d87405543b150a5b6331deb6"} Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.038480 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" event={"ID":"c9f87520-6105-4b6f-ba5a-a232b5dc24c0","Type":"ContainerStarted","Data":"58958cb3a6a43945e2e85351e464979217571ace2e8aff3602776ecb003df0be"} Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.038491 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" event={"ID":"c9f87520-6105-4b6f-ba5a-a232b5dc24c0","Type":"ContainerStarted","Data":"6f75ed8b74bcf143893e0244f5d561e39792f3c066b4ea296e5e75dd0999a43a"} Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.040084 4906 generic.go:334] "Generic (PLEG): container finished" podID="7e902b2c-8bf9-49e8-9820-392f34dbfb10" containerID="2137af85643edb4160cec0443c914a41c18cde0c48615d328ef82c9cb5976685" exitCode=0 Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.040791 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jgfpc" event={"ID":"7e902b2c-8bf9-49e8-9820-392f34dbfb10","Type":"ContainerDied","Data":"2137af85643edb4160cec0443c914a41c18cde0c48615d328ef82c9cb5976685"} Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.042109 4906 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.042126 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.042134 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.042145 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.042153 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:02Z","lastTransitionTime":"2026-03-10T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.056334 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://161718aee1b97751dccdff3689a0dc26eb0efb1842899be5bb4994598adb36db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.074925 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-85dv2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c494c18-0d46-4e23-8ef5-214938a66a7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c5df9fb921fc76518f8da175e0a9b4e05f248b292a22ee6bcb600e07d94077a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftt77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"
}}\" for pod \"openshift-multus\"/\"multus-85dv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.087574 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8681864-d447-44e6-9cf5-6408d49ec58a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4aeb4d2ad9fadbae974054da89166076b81a944e08f5475f1a9382a48cdc623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\
\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.104551 4906 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c37b204d34ae2c304ca12b8978e189ce99a6c9352c9f68e01d074a8fccc1d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.116841 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bd57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a677000-e7dd-40fa-ad3e-94c061293e29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dedc89ecfde3aab4fefadbd750541c4fe076777f8e0583ab87c3a33ad2d3b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hdfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bd57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.144755 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hskrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.147162 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.147210 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.147226 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.147253 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.147274 4906 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:02Z","lastTransitionTime":"2026-03-10T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.163225 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.184390 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jgfpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e902b2c-8bf9-49e8-9820-392f34dbfb10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://461ade02a7f0df808708774e4e7ac0d3e416bbc56988253f3f7741b2403f411e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://461ade02a7f0df808708774e4e7ac0d3e416bbc56988253f3f7741b2403f411e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2137af85643edb4160cec0443c914a41c18cde0c48615d328ef82c9cb5976685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2137af85643edb4160cec0443c914a41c18cde0c48615d328ef82c9cb5976685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jgfpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-10T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.206600 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1c5481b-1561-42af-9126-977132b7ced2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8b43acbd8d86bc6523a3304162f30fcf4068197c4a56018dc56e1c8ae0bcd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3effb7d619bec3fd1ef64a418efe1e3503cb2dd187ce2188911a90b4034d331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad3cc19888cbd84c556e52eda6a3be770768b7316982d6442717f6aee62191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd888b723dd7b84ded4f27c69d82e2b203fec96dbaab8198df43bf0806b34ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e5dc928d14830f187866775b2cd95c46cff2d7749025b36d16f6489662bafa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":
\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.219809 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.233798 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.250709 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8518e6b0-48c0-4076-a58a-e2ba8751c90f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1ce33b4800ac33b19f03506f197894abd2b965c34db7b506d7f06e8824b0229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1237a4e233c776796482d73c65c85e5d86e592d6a7f53a7b76fea4529fe9e435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e445f27670002f5ac446f9be56896acdbf9e3e30801b073b231cc8ce7ddac7c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75634c35b3e26fec38c8aa89f5a8055117f60566c8d46cf9ec28cad40c688be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab422d45102787731cdbfd6d0e400974761d5af01f0a43d04cc25228c1db30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:54Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 00:06:54.228491 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:06:54.228657 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:06:54.229423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1709734890/tls.crt::/tmp/serving-cert-1709734890/tls.key\\\\\\\"\\\\nI0310 00:06:54.538287 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:06:54.541181 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:06:54.541202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:06:54.541225 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:06:54.541230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:06:54.549119 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 00:06:54.549178 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549189 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549199 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:06:54.549207 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:06:54.549213 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:06:54.549219 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 00:06:54.549128 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0310 00:06:54.551727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c49063cf40b4074c120bf78de05481a9e5638b11137a47d73d15b3e1a43f41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca17
0f5955a2014a2d1e693bfb82373f964\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.251512 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.251563 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.251578 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.251600 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.251615 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:02Z","lastTransitionTime":"2026-03-10T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.263582 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72569fcc50f473ea417eb7403f0901fc02d50a886ea39d62b2bb02682bc64dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ed04015b5fffd0db47db60c0aec762bd61d476c7cc3c0e726a5bd5de43baef\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.276519 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72d61d35-0a64-45a5-8df3-9c429727deba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da01514bb52ced358e921f135d8e59cb221cf1c2c633fb9ee2b9e3d90ef0ffce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f53c021f85bbd1f77bb00169203a7ee8629e31f3
fab47b40dc594d7995d4b82a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bxtw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.354498 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.354540 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.354551 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:02 crc 
kubenswrapper[4906]: I0310 00:08:02.354572 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.354583 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:02Z","lastTransitionTime":"2026-03-10T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.457413 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.457506 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.457521 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.457545 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.457560 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:02Z","lastTransitionTime":"2026-03-10T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.560568 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.560627 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.560672 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.560693 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.560706 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:02Z","lastTransitionTime":"2026-03-10T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.576311 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.576480 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.576712 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:02 crc kubenswrapper[4906]: E0310 00:08:02.576704 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:02 crc kubenswrapper[4906]: E0310 00:08:02.576940 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:02 crc kubenswrapper[4906]: E0310 00:08:02.577049 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.663752 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.663819 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.663836 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.663867 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.663885 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:02Z","lastTransitionTime":"2026-03-10T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.766519 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.766564 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.766573 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.766592 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.766604 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:02Z","lastTransitionTime":"2026-03-10T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.870204 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.870243 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.870252 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.870274 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.870285 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:02Z","lastTransitionTime":"2026-03-10T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.973779 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.973820 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.973828 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.973848 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:02 crc kubenswrapper[4906]: I0310 00:08:02.973860 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:02Z","lastTransitionTime":"2026-03-10T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.046248 4906 generic.go:334] "Generic (PLEG): container finished" podID="7e902b2c-8bf9-49e8-9820-392f34dbfb10" containerID="13423588b4cb100ba7f85054546fccc23f287f313cbd61c4b15ae75e9fd7135a" exitCode=0 Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.046328 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jgfpc" event={"ID":"7e902b2c-8bf9-49e8-9820-392f34dbfb10","Type":"ContainerDied","Data":"13423588b4cb100ba7f85054546fccc23f287f313cbd61c4b15ae75e9fd7135a"} Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.051692 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" event={"ID":"c9f87520-6105-4b6f-ba5a-a232b5dc24c0","Type":"ContainerStarted","Data":"c09a8ef5244efb12dbf1782379b9633e2c35e936c2d1df7d715481c9b3bf0b21"} Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.051729 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" event={"ID":"c9f87520-6105-4b6f-ba5a-a232b5dc24c0","Type":"ContainerStarted","Data":"f0d58d338e79d263fe3e7d3ca54f65b5ff80376c0c094e3f51c66c00ad8ae878"} Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.063922 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72569fcc50f473ea417eb7403f0901fc02d50a886ea39d62b2bb02682bc64dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ed04015b5fffd0db47db60c0aec762bd61d476c7cc3c0e726a5bd5de43baef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.076830 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.076876 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.076892 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.076922 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.076961 4906 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:03Z","lastTransitionTime":"2026-03-10T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.078729 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72d61d35-0a64-45a5-8df3-9c429727deba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da01514bb52ced358e921f135d8e59cb221cf1c2c633fb9ee2b9e3d90ef0ffce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f53c021f85bbd1f77bb00169203a7ee8629e31f3fab47b40dc594d7995d4b82a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bxtw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-10T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.094652 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8518e6b0-48c0-4076-a58a-e2ba8751c90f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1ce33b4800ac33b19f03506f197894abd2b965c34db7b506d7f06e8824b0229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"
name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1237a4e233c776796482d73c65c85e5d86e592d6a7f53a7b76fea4529fe9e435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e445f27670002f5ac446f9be56896acdbf9e3e30801b073b231cc8ce7ddac7c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75634c35b3e26fec38c8aa89f5a8055117f60566c8d46cf9ec28cad40c688be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc
478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab422d45102787731cdbfd6d0e400974761d5af01f0a43d04cc25228c1db30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:06:54.228491 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:06:54.228657 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:06:54.229423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1709734890/tls.crt::/tmp/serving-cert-1709734890/tls.key\\\\\\\"\\\\nI0310 00:06:54.538287 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:06:54.541181 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:06:54.541202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:06:54.541225 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:06:54.541230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:06:54.549119 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 00:06:54.549178 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549189 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549199 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:06:54.549207 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:06:54.549213 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:06:54.549219 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 00:06:54.549128 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 00:06:54.551727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c49063cf40b4074c120bf78de05481a9e5638b11137a47d73d15b3e1a43f41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.110934 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-85dv2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c494c18-0d46-4e23-8ef5-214938a66a7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c5df9fb921fc76518f8da175e0a9b4e05f248b292a22ee6bcb600e07d94077a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftt77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-85dv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.128617 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://161718aee1b97751dccdff3689a0dc26eb0efb1842899be5bb4994598adb36db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-10T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.143862 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c37b204d34ae2c304ca12b8978e189ce99a6c9352c9f68e01d074a8fccc1d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.159007 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bd57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a677000-e7dd-40fa-ad3e-94c061293e29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dedc89ecfde3aab4fefadbd750541c4fe076777f8e0583ab87c3a33ad2d3b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hdfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bd57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.179220 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.179258 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.179267 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.179286 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.179296 4906 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:03Z","lastTransitionTime":"2026-03-10T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.193145 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hskrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.204790 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8681864-d447-44e6-9cf5-6408d49ec58a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4aeb4d2ad9fadbae974054da89166076b81a944e08f5475f1a9382a48cdc623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bde
af3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.234945 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1c5481b-1561-42af-9126-977132b7ced2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8b43acbd8d86bc6523a3304162f30fcf4068197c4a56018dc56e1c8ae0bcd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\
"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3effb7d619bec3fd1ef64a418efe1e3503cb2dd187ce2188911a90b4034d331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad3cc19888cbd84c556e52eda6a3be770768b7316982d6442717f6aee62191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd888b723dd7b84ded4f27c69d82e2b203fec96dbaab8198df43bf0806b34ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847
b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e5dc928d14830f187866775b2cd95c46cff2d7749025b36d16f6489662bafa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e
9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"exitCode\\\":0,\\
\"finishedAt\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.251546 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.263691 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.277285 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.282029 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.282166 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.282344 4906 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.282489 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.282571 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:03Z","lastTransitionTime":"2026-03-10T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.294749 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jgfpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e902b2c-8bf9-49e8-9820-392f34dbfb10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://461ade02a7f0df808708774e4e7ac0d3e416bbc56988253f3f7741b2403f411e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://461ade02a7f0df808708774e4e7ac0d3e416bbc56988253f3f7741b2403f411e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2137af85643edb4160cec0443c914a41c18cde0c48615d328ef82c9cb5976685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2137af85643edb4160cec0443c914a41c18cde0c48615d328ef82c9cb5976685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13423588b4cb100ba7f85054546fccc23f287
f313cbd61c4b15ae75e9fd7135a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13423588b4cb100ba7f85054546fccc23f287f313cbd61c4b15ae75e9fd7135a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jgfpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.385433 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.385465 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.385474 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.385492 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.385506 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:03Z","lastTransitionTime":"2026-03-10T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.487868 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.488147 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.488229 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.488313 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.488384 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:03Z","lastTransitionTime":"2026-03-10T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.591249 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.591308 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.591330 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.591358 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.591377 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:03Z","lastTransitionTime":"2026-03-10T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.694092 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.694152 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.694164 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.694189 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.694203 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:03Z","lastTransitionTime":"2026-03-10T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.797713 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.798085 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.798278 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.798466 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.798583 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:03Z","lastTransitionTime":"2026-03-10T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.901586 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.901688 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.901710 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.901739 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:03 crc kubenswrapper[4906]: I0310 00:08:03.901765 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:03Z","lastTransitionTime":"2026-03-10T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.005945 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.006024 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.006050 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.006087 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.006107 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:04Z","lastTransitionTime":"2026-03-10T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.060246 4906 generic.go:334] "Generic (PLEG): container finished" podID="7e902b2c-8bf9-49e8-9820-392f34dbfb10" containerID="3c38493d85c4cafaa4905f7ce5ad87a912694bd749ea8f13483b3cdae026d357" exitCode=0 Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.060317 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jgfpc" event={"ID":"7e902b2c-8bf9-49e8-9820-392f34dbfb10","Type":"ContainerDied","Data":"3c38493d85c4cafaa4905f7ce5ad87a912694bd749ea8f13483b3cdae026d357"} Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.080282 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1c5481b-1561-42af-9126-977132b7ced2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8b43acbd8d86bc6523a3304162f30fcf4068197c4a56018dc56e1c8ae0bcd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3effb7d619bec3fd1ef64a418efe1e3503cb2dd187ce2188911a90b4034d331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad3cc19888cbd84c556e52eda6a3be770768b7316982d6442717f6aee62191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd888b723dd7b84ded4f27c69d82e2b203fec96dbaab8198df43bf0806b34ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e5dc928d14830f187866775b2cd95c46cff2d7749025b36d16f6489662bafa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-d
ir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.098468 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.109065 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.109139 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.109163 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.109201 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.109226 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:04Z","lastTransitionTime":"2026-03-10T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.116536 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.139549 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.162281 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jgfpc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e902b2c-8bf9-49e8-9820-392f34dbfb10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://461ade02a7f0df808708774e4e7ac0d3e416bbc56988253f3f7741b2403f411e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://461ade02a7f0df808708774e4e7ac0d3e416bbc56988253f3f7741b2403f411e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2137af85643edb4160cec0443c914a41c18cde0c48615d328ef82c9cb5976685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2137af85643edb4160cec0443c914a41c18cde0c48615d328ef82c9cb5976685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13423588b4cb100ba7f85054546fccc23f287f313cbd61c4b15ae75e9fd7135a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13423588b4cb100ba7f85054546fccc23f287f313cbd61c4b15ae75e9fd7135a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c38493d85c4cafaa4905f7ce5ad87a912694bd749ea8f13483b3cdae026d357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c38493d85c4cafaa4905f7ce5ad87a912694bd749ea8f13483b3cdae026d357\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-jgfpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.181943 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8518e6b0-48c0-4076-a58a-e2ba8751c90f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1ce33b4800ac33b19f03506f197894abd2b965c34db7b506d7f06e8824b0229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\
\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1237a4e233c776796482d73c65c85e5d86e592d6a7f53a7b76fea4529fe9e435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e445f27670002f5ac446f9be56896acdbf9e3e30801b073b231cc8ce7ddac7c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75634c3
5b3e26fec38c8aa89f5a8055117f60566c8d46cf9ec28cad40c688be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab422d45102787731cdbfd6d0e400974761d5af01f0a43d04cc25228c1db30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:06:54.228491 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:06:54.228657 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:06:54.229423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1709734890/tls.crt::/tmp/serving-cert-1709734890/tls.key\\\\\\\"\\\\nI0310 00:06:54.538287 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:06:54.541181 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:06:54.541202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:06:54.541225 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:06:54.541230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:06:54.549119 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 00:06:54.549178 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549189 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549199 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:06:54.549207 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:06:54.549213 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:06:54.549219 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 00:06:54.549128 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 00:06:54.551727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c49063cf40b4074c120bf78de05481a9e5638b11137a47d73d15b3e1a43f41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.195410 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72569fcc50f473ea417eb7403f0901fc02d50a886ea39d62b2bb02682bc64dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ed04015b5fffd0db47db60c0aec762bd61d476c7cc3c0e726a5bd5de43baef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.209286 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72d61d35-0a64-45a5-8df3-9c429727deba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da01514bb52ced358e921f135d8e59cb221cf1c2c633fb9ee2b9e3d90ef0ffce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f53c021f85bbd1f77bb00169203a7ee8629e31f3
fab47b40dc594d7995d4b82a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bxtw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.211686 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.211731 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.211750 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:04 crc 
kubenswrapper[4906]: I0310 00:08:04.211777 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.211795 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:04Z","lastTransitionTime":"2026-03-10T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.224380 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://161718aee1b97751dccdff3689a0dc26eb0efb1842899be5bb4994598adb36db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.240442 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-85dv2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c494c18-0d46-4e23-8ef5-214938a66a7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c5df9fb921fc76518f8da175e0a9b4e05f248b292a22ee6bcb600e07d94077a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftt77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-85dv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.256097 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8681864-d447-44e6-9cf5-6408d49ec58a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4aeb4d2ad9fadbae974054da89166076b81a944e08f5475f1a9382a48cdc623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c427
45f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.271220 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c37b204d34ae2c304ca12b8978e189ce99a6c9352c9f68e01d074a8fccc1d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.284077 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bd57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a677000-e7dd-40fa-ad3e-94c061293e29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dedc89ecfde3aab4fefadbd750541c4fe076777f8e0583ab87c3a33ad2d3b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hdfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bd57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.307573 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hskrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.316778 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.316822 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.316839 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.316865 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.316882 4906 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:04Z","lastTransitionTime":"2026-03-10T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.419395 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.419439 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.419451 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.419469 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.419481 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:04Z","lastTransitionTime":"2026-03-10T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.522541 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.522607 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.522619 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.522649 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.522659 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:04Z","lastTransitionTime":"2026-03-10T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.576170 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:04 crc kubenswrapper[4906]: E0310 00:08:04.576304 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.576347 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:04 crc kubenswrapper[4906]: E0310 00:08:04.576418 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.576482 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:04 crc kubenswrapper[4906]: E0310 00:08:04.576540 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.589694 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://161718aee1b97751dccdff3689a0dc26eb0efb1842899be5bb4994598adb36db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.601916 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-85dv2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c494c18-0d46-4e23-8ef5-214938a66a7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c5df9fb921fc76518f8da175e0a9b4e05f248b292a22ee6bcb600e07d94077a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-m
ultus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftt77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-85dv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.612140 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8681864-d447-44e6-9cf5-6408d49ec58a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4aeb4d2ad9fadbae974054da89166076b81a944e08f5475f1a9382a48cdc623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\
":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.624953 4906 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c37b204d34ae2c304ca12b8978e189ce99a6c9352c9f68e01d074a8fccc1d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.625073 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.625102 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.625114 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.625138 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.625150 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:04Z","lastTransitionTime":"2026-03-10T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.636944 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bd57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a677000-e7dd-40fa-ad3e-94c061293e29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dedc89ecfde3aab4fefadbd750541c4fe076777f8e0583ab87c3a33ad2d3b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hdfw\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bd57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.667890 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hskrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.721341 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1c5481b-1561-42af-9126-977132b7ced2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8b43acbd8d86bc6523a3304162f30fcf4068197c4a56018dc56e1c8ae0bcd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3effb7d619bec3fd1ef64a418efe1e3503cb2dd187ce2188911a90b4034d331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad3cc19888cbd84c556e52eda6a3be770768b7316982d6442717f6aee62191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd888b723dd7b84ded4f27c69d82e2b203fec96dbaab8198df43bf0806b34ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e5dc928d14830f187866775b2cd95c46cff2d7749025b36d16f6489662bafa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.728138 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.728175 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.728186 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.728204 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.728215 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:04Z","lastTransitionTime":"2026-03-10T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.735490 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.748543 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.761238 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.776650 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jgfpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e902b2c-8bf9-49e8-9820-392f34dbfb10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://461ade02a7f0df808708774e4e7ac0d3e416bbc56988253f3f7741b2403f411e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://461ade02a7f0df808708774e4e7ac0d3e416bbc56988253f3f7741b2403f411e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2137af85643edb4160cec0443c914a41c18cde0c48615d328ef82c9cb5976685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2137af85643edb4160cec0443c914a41c18cde0c48615d328ef82c9cb5976685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13423588b4cb100ba7f85054546fccc23f287f313cbd61c4b15ae75e9fd7135a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13423588b4cb100ba7f85054546fccc23f287f313cbd61c4b15ae75e9fd7135a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c38493d85c4cafaa4905f7ce5ad87a912694bd749ea8f13483b3cdae026d357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c38493d85c4cafaa4905f7ce5ad87a912694bd749ea8f13483b3cdae026d357\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jgfpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.791539 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8518e6b0-48c0-4076-a58a-e2ba8751c90f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1ce33b4800ac33b19f03506f197894abd2b965c34db7b506d7f06e8824b0229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1237a4e233c776796482d73c65c85e5d86e592d6a7f53a7b76fea4529fe9e435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e445f27670002f5ac446f9be56896acdbf9e3e30801b073b231cc8ce7ddac7c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75634c35b3e26fec38c8aa89f5a8055117f60566c8d46cf9ec28cad40c688be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab422d45102787731cdbfd6d0e400974761d5af01f0a43d04cc25228c1db30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:54Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 00:06:54.228491 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:06:54.228657 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:06:54.229423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1709734890/tls.crt::/tmp/serving-cert-1709734890/tls.key\\\\\\\"\\\\nI0310 00:06:54.538287 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:06:54.541181 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:06:54.541202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:06:54.541225 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:06:54.541230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:06:54.549119 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 00:06:54.549178 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549189 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549199 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:06:54.549207 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:06:54.549213 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:06:54.549219 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 00:06:54.549128 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0310 00:06:54.551727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c49063cf40b4074c120bf78de05481a9e5638b11137a47d73d15b3e1a43f41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca17
0f5955a2014a2d1e693bfb82373f964\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.803888 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72569fcc50f473ea417eb7403f0901fc02d50a886ea39d62b2bb02682bc64dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ed04015b5fffd0db47db60c0aec762bd61d476c7cc3c0e726a5bd5de43baef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.813535 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72d61d35-0a64-45a5-8df3-9c429727deba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da01514bb52ced358e921f135d8e59cb221cf1c2c633fb9ee2b9e3d90ef0ffce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f53c021f85bbd1f77bb00169203a7ee8629e31f3
fab47b40dc594d7995d4b82a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bxtw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.830056 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.830102 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.830118 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:04 crc 
kubenswrapper[4906]: I0310 00:08:04.830140 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.830158 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:04Z","lastTransitionTime":"2026-03-10T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.931869 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.931901 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.931909 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.931924 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:04 crc kubenswrapper[4906]: I0310 00:08:04.931933 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:04Z","lastTransitionTime":"2026-03-10T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.034084 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.034157 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.034178 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.034207 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.034233 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:05Z","lastTransitionTime":"2026-03-10T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.067198 4906 generic.go:334] "Generic (PLEG): container finished" podID="7e902b2c-8bf9-49e8-9820-392f34dbfb10" containerID="d0d67319708302cc33fdc4484ac7ef77101029f60a4733f0f10f7ebb0bbbd86d" exitCode=0 Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.067326 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jgfpc" event={"ID":"7e902b2c-8bf9-49e8-9820-392f34dbfb10","Type":"ContainerDied","Data":"d0d67319708302cc33fdc4484ac7ef77101029f60a4733f0f10f7ebb0bbbd86d"} Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.074387 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" event={"ID":"c9f87520-6105-4b6f-ba5a-a232b5dc24c0","Type":"ContainerStarted","Data":"410b0c40f63f43dcd6fb50a359080c2c076f62858b948841cdbaf8c354d198f7"} Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.081544 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://161718aee1b97751dccdff3689a0dc26eb0efb1842899be5bb4994598adb36db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.096234 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-85dv2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c494c18-0d46-4e23-8ef5-214938a66a7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c5df9fb921fc76518f8da175e0a9b4e05f248b292a22ee6bcb600e07d94077a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftt77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-85dv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.108185 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bd57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a677000-e7dd-40fa-ad3e-94c061293e29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dedc89ecfde3aab4fefadbd750541c4fe076777f8e0583ab87c3a33ad2d3b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-6hdfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bd57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.138394 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.138443 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.138459 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.138484 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.138501 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:05Z","lastTransitionTime":"2026-03-10T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.139664 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hskrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.151865 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8681864-d447-44e6-9cf5-6408d49ec58a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4aeb4d2ad9fadbae974054da89166076b81a944e08f5475f1a9382a48cdc623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bde
af3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.182999 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c37b204d34ae2c304ca12b8978e189ce99a6c9352c9f68e01d074a8fccc1d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.196731 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.207981 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.218422 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.233100 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jgfpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e902b2c-8bf9-49e8-9820-392f34dbfb10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://461ade02a7f0df808708774e4e7ac0d3e416bbc56988253f3f7741b2403f411e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://461ade02a7f0df808708774e4e7ac0d3e416bbc56988253f3f7741b2403f411e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2137af85643edb4160cec0443c914a41c18cde0c48615d328ef82c9cb5976685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2137af85643edb4160cec0443c914a41c18cde0c48615d328ef82c9cb5976685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13423588b4cb100ba7f85054546fccc23f287f313cbd61c4b15ae75e9fd7135a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13423588b4cb100ba7f85054546fccc23f287f313cbd61c4b15ae75e9fd7135a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c38493d85c4cafaa4905f7ce5ad87a912694bd749ea8f13483b3cdae026d357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c38493d85c4cafaa4905f7ce5ad87a912694bd749ea8f13483b3cdae026d357\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0d67319708302cc33fdc4484ac7ef77101029f60a4733f0f10f7ebb0bbbd86d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0d67319708302cc33fdc4484ac7ef77101029f60a4733f0f10f7ebb0bbbd86d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jgfpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.241179 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.241225 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.241239 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.241262 4906 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.241276 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:05Z","lastTransitionTime":"2026-03-10T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.252331 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1c5481b-1561-42af-9126-977132b7ced2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8b43acbd8d86bc6523a3304162f30fcf4068197c4a56018dc56e1c8ae0bcd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3effb7d619bec3fd1ef64a418efe1e3503cb2dd187ce2188911a90b4034d331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad3cc19888cbd84c556e52eda6a3be770768b7316982d6442717f6aee62191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T0
0:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd888b723dd7b84ded4f27c69d82e2b203fec96dbaab8198df43bf0806b34ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e5dc928d14830f187866775b2cd95c46cff2d7749025b36d16f6489662bafa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.265115 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72569fcc50f473ea417eb7403f0901fc02d50a886ea39d62b2bb02682bc64dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ed04015b5fffd0db47db60c0aec762bd61d476c7cc3c0e726a5bd5de43baef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.276966 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72d61d35-0a64-45a5-8df3-9c429727deba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da01514bb52ced358e921f135d8e59cb221cf1c2c633fb9ee2b9e3d90ef0ffce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f53c021f85bbd1f77bb00169203a7ee8629e31f3
fab47b40dc594d7995d4b82a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bxtw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.292169 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8518e6b0-48c0-4076-a58a-e2ba8751c90f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1ce33b4800ac33b19f03506f197894abd2b965c34db7b506d7f06e8824b0229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1237a4e233c776796482d73c65c85e5d86e592d6a7f53a7b76fea4529fe9e435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e445f27670002f5ac446f9be56896acdbf9e3e30801b073b231cc8ce7ddac7c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75634c35b3e26fec38c8aa89f5a8055117f60566c8d46cf9ec28cad40c688be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab422d45102787731cdbfd6d0e400974761d5af01f0a43d04cc25228c1db30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:54Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 00:06:54.228491 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:06:54.228657 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:06:54.229423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1709734890/tls.crt::/tmp/serving-cert-1709734890/tls.key\\\\\\\"\\\\nI0310 00:06:54.538287 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:06:54.541181 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:06:54.541202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:06:54.541225 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:06:54.541230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:06:54.549119 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 00:06:54.549178 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549189 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549199 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:06:54.549207 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:06:54.549213 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:06:54.549219 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 00:06:54.549128 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0310 00:06:54.551727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c49063cf40b4074c120bf78de05481a9e5638b11137a47d73d15b3e1a43f41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca17
0f5955a2014a2d1e693bfb82373f964\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.344069 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.344107 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.344114 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.344130 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.344140 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:05Z","lastTransitionTime":"2026-03-10T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.446681 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.446733 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.446743 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.446761 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.446770 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:05Z","lastTransitionTime":"2026-03-10T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.549885 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.549965 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.549976 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.549997 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.550010 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:05Z","lastTransitionTime":"2026-03-10T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.632129 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.632236 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.632255 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.632283 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.632305 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:05Z","lastTransitionTime":"2026-03-10T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:05 crc kubenswrapper[4906]: E0310 00:08:05.649583 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a60179e-98a5-4d7f-9dd0-5aef84f37492\\\",\\\"systemUUID\\\":\\\"5a964b87-c4f2-4bba-95e1-b2c12e6316ae\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.654944 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.654986 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.654995 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.655012 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.655022 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:05Z","lastTransitionTime":"2026-03-10T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:05 crc kubenswrapper[4906]: E0310 00:08:05.673774 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a60179e-98a5-4d7f-9dd0-5aef84f37492\\\",\\\"systemUUID\\\":\\\"5a964b87-c4f2-4bba-95e1-b2c12e6316ae\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.678414 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.678573 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.678613 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.678667 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.678685 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:05Z","lastTransitionTime":"2026-03-10T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:05 crc kubenswrapper[4906]: E0310 00:08:05.696182 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a60179e-98a5-4d7f-9dd0-5aef84f37492\\\",\\\"systemUUID\\\":\\\"5a964b87-c4f2-4bba-95e1-b2c12e6316ae\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.700617 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.700715 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.700739 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.700791 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.700816 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:05Z","lastTransitionTime":"2026-03-10T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:05 crc kubenswrapper[4906]: E0310 00:08:05.717986 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a60179e-98a5-4d7f-9dd0-5aef84f37492\\\",\\\"systemUUID\\\":\\\"5a964b87-c4f2-4bba-95e1-b2c12e6316ae\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.723091 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.723124 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.723132 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.723148 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.723159 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:05Z","lastTransitionTime":"2026-03-10T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:05 crc kubenswrapper[4906]: E0310 00:08:05.739608 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a60179e-98a5-4d7f-9dd0-5aef84f37492\\\",\\\"systemUUID\\\":\\\"5a964b87-c4f2-4bba-95e1-b2c12e6316ae\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:05 crc kubenswrapper[4906]: E0310 00:08:05.739745 4906 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.741611 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.741703 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.741723 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.741751 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.741779 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:05Z","lastTransitionTime":"2026-03-10T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.844777 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.844843 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.844881 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.844913 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.844931 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:05Z","lastTransitionTime":"2026-03-10T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.948313 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.948390 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.948415 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.948442 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:05 crc kubenswrapper[4906]: I0310 00:08:05.948460 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:05Z","lastTransitionTime":"2026-03-10T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.051307 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.051377 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.051399 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.051433 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.051463 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:06Z","lastTransitionTime":"2026-03-10T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.084668 4906 generic.go:334] "Generic (PLEG): container finished" podID="7e902b2c-8bf9-49e8-9820-392f34dbfb10" containerID="860be515a0723147ee78d8b9e1bc3d3fff9675739ced0812fc9207df8a481fec" exitCode=0 Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.084754 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jgfpc" event={"ID":"7e902b2c-8bf9-49e8-9820-392f34dbfb10","Type":"ContainerDied","Data":"860be515a0723147ee78d8b9e1bc3d3fff9675739ced0812fc9207df8a481fec"} Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.106707 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8681864-d447-44e6-9cf5-6408d49ec58a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4aeb4d2ad9fadbae974054da89166076b81a944e08f5475f1a9382a48cdc623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.129608 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c37b204d34ae2c304ca12b8978e189ce99a6c9352c9f68e01d074a8fccc1d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\
" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.144760 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bd57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a677000-e7dd-40fa-ad3e-94c061293e29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dedc89ecfde3aab4fefadbd750541c4fe076777f8e0583ab87c3a33ad2d3b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hdfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bd57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.159051 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.159213 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.159235 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.159264 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.159285 4906 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:06Z","lastTransitionTime":"2026-03-10T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.172108 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hskrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.205521 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1c5481b-1561-42af-9126-977132b7ced2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8b43acbd8d86bc6523a3304162f30fcf4068197c4a56018dc56e1c8ae0bcd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3effb7d619bec3fd1ef64a418efe1e3503cb2dd187ce2188911a90b4034d331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad3cc19888cbd84c556e52eda6a3be770768b7316982d6442717f6aee62191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd888b723dd7b84ded4f27c69d82e2b203fec96dbaab8198df43bf0806b34ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e5dc928d14830f187866775b2cd95c46cff2d7749025b36d16f6489662bafa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.226973 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.243295 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.257776 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.262893 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.262930 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.262941 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.262960 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.262972 4906 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:06Z","lastTransitionTime":"2026-03-10T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.276743 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jgfpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e902b2c-8bf9-49e8-9820-392f34dbfb10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://461ade02a7f0df808708774e4e7ac0d3e416bbc56988253f3f7741b2403f411e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://461ade02a7f0df808708774e4e7ac0d3e416bbc56988253f3f7741b2403f411e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2137af85643edb4160cec0443c914a41c18cde0c48615d328ef82c9cb5976685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2137af85643edb4160cec0443c914a41c18cde0c48615d328ef82c9cb5976685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13423588b4cb100ba7f85054546fccc23f287f313cbd61c4b15ae75e9fd7135a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13423588b4cb100ba7f85054546fccc23f287f313cbd61c4b15ae75e9fd7135a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c38493d85c4cafaa4905f7ce5ad87a912694bd749ea8f13483b3cdae026d357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c38493d85c4cafaa4905f7ce5ad87a912694bd749ea8f13483b3cdae026d357\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0d67319708302cc33fdc4484ac7ef77101029f60a4733f0f10f7ebb0bbbd86d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0d67319708302cc33fdc4484ac7ef77101029f60a4733f0f10f7ebb0bbbd86d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860be515a0723147ee78d8b9e1bc3d3fff9675739ced0812fc9207df8a481fec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://860be515a0723147ee78d8b9e1bc3d3fff9675739ced0812fc9207df8a481fec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jgfpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.293558 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8518e6b0-48c0-4076-a58a-e2ba8751c90f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1ce33b4800ac33b19f03506f197894abd2b965c34db7b506d7f06e8824b0229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1237a4e233c776796482d73c65c85e5d86e592d6a7f53a7b76fea4529fe9e435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e445f27670002f5ac446f9be56896acdbf9e3e30801b073b231cc8ce7ddac7c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75634c35b3e26fec38c8aa89f5a8055117f60566c8d46cf9ec28cad40c688be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab422d45102787731cdbfd6d0e400974761d5af01f0a43d04cc25228c1db30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:54Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 00:06:54.228491 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:06:54.228657 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:06:54.229423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1709734890/tls.crt::/tmp/serving-cert-1709734890/tls.key\\\\\\\"\\\\nI0310 00:06:54.538287 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:06:54.541181 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:06:54.541202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:06:54.541225 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:06:54.541230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:06:54.549119 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 00:06:54.549178 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549189 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549199 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:06:54.549207 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:06:54.549213 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:06:54.549219 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 00:06:54.549128 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0310 00:06:54.551727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c49063cf40b4074c120bf78de05481a9e5638b11137a47d73d15b3e1a43f41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca17
0f5955a2014a2d1e693bfb82373f964\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.307591 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72569fcc50f473ea417eb7403f0901fc02d50a886ea39d62b2bb02682bc64dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ed04015b5fffd0db47db60c0aec762bd61d476c7cc3c0e726a5bd5de43baef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.321885 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72d61d35-0a64-45a5-8df3-9c429727deba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da01514bb52ced358e921f135d8e59cb221cf1c2c633fb9ee2b9e3d90ef0ffce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f53c021f85bbd1f77bb00169203a7ee8629e31f3
fab47b40dc594d7995d4b82a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bxtw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.334721 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://161718aee1b97751dccdff3689a0dc26eb0efb1842899be5bb4994598adb36db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.348444 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-85dv2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c494c18-0d46-4e23-8ef5-214938a66a7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c5df9fb921fc76518f8da175e0a9b4e05f248b292a22ee6bcb600e07d94077a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftt77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-85dv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.365967 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.366008 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.366019 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.366039 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.366052 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:06Z","lastTransitionTime":"2026-03-10T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.468685 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.468729 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.468740 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.468757 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.468768 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:06Z","lastTransitionTime":"2026-03-10T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.571837 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.571921 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.571941 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.571970 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.571989 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:06Z","lastTransitionTime":"2026-03-10T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.576226 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:06 crc kubenswrapper[4906]: E0310 00:08:06.576417 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.576481 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.576526 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:06 crc kubenswrapper[4906]: E0310 00:08:06.576666 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:06 crc kubenswrapper[4906]: E0310 00:08:06.576825 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.675375 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.675449 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.675466 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.675490 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.675511 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:06Z","lastTransitionTime":"2026-03-10T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.778208 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.778250 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.778259 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.778277 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.778287 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:06Z","lastTransitionTime":"2026-03-10T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.852353 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-vwftl"] Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.852792 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-vwftl" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.855358 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.855570 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.855841 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.856859 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.873865 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.880049 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.880074 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.880081 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 
00:08:06.880097 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.880105 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:06Z","lastTransitionTime":"2026-03-10T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.888556 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.900657 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.912914 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c755e546-15de-4347-bc90-03a4ea362583-host\") pod \"node-ca-vwftl\" (UID: \"c755e546-15de-4347-bc90-03a4ea362583\") " pod="openshift-image-registry/node-ca-vwftl" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.913040 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c755e546-15de-4347-bc90-03a4ea362583-serviceca\") pod \"node-ca-vwftl\" (UID: \"c755e546-15de-4347-bc90-03a4ea362583\") " pod="openshift-image-registry/node-ca-vwftl" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.913092 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-gj5d7\" (UniqueName: \"kubernetes.io/projected/c755e546-15de-4347-bc90-03a4ea362583-kube-api-access-gj5d7\") pod \"node-ca-vwftl\" (UID: \"c755e546-15de-4347-bc90-03a4ea362583\") " pod="openshift-image-registry/node-ca-vwftl" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.915535 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jgfpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e902b2c-8bf9-49e8-9820-392f34dbfb10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://461ade02a7f0df808708774e4e7ac0d3e416bbc56988253f3f7741b2403f411e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://461ade02a7f0df808708774e4e7ac0d3e416bbc56988253f3f7741b2403f411e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2137af85643edb4160cec0443c914a41c18cde0c48615d328ef82c9cb5976685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2137af85643edb4160cec0443c914a41c18cde0c48615d328ef82c9cb5976685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13423588b4cb100ba7f85054546fccc23f287f313cbd61c4b15ae75e9fd7135a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13423588b4cb100ba7f85054546fccc23f287f313cbd61c4b15ae75e9fd7135a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c38493d85c4cafaa4905f7ce5ad87a912694bd749ea8f13483b3cdae026d357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c38493d85c4cafaa4905f7ce5ad87a912694bd749ea8f13483b3cdae026d357\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0d67319708302cc33fdc4484ac7ef77101029f60a4733f0f10f7ebb0bbbd86d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0d67319708302cc33fdc4484ac7ef77101029f60a4733f0f10f7ebb0bbbd86d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860be515a0723147ee78d8b9e1bc3d3fff9675739ced0812fc9207df8a481fec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://860be515a0723147ee78d8b9e1bc3d3fff9675739ced0812fc9207df8a481fec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jgfpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.934568 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1c5481b-1561-42af-9126-977132b7ced2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8b43acbd8d86bc6523a3304162f30fcf4068197c4a56018dc56e1c8ae0bcd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3effb7d619bec3fd1ef64a418efe1e3503cb2dd187ce2188911a90b4034d331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad3cc19888cbd84c556e52eda6a3be770768b7316982d6442717f6aee62191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd888b723dd7b84ded4f27c69d82e2b203fec96dbaab8198df43bf0806b34ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e5dc928d14830f187866775b2cd95c46cff2d7749025b36d16f6489662bafa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.946303 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72569fcc50f473ea417eb7403f0901fc02d50a886ea39d62b2bb02682bc64dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ed04015b5fffd0db47db60c0aec762bd61d476c7cc3c0e726a5bd5de43baef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.961923 4906 
status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72d61d35-0a64-45a5-8df3-9c429727deba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da01514bb52ced358e921f135d8e59cb221cf1c2c633fb9ee2b9e3d90ef0ffce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acces
s-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f53c021f85bbd1f77bb00169203a7ee8629e31f3fab47b40dc594d7995d4b82a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bxtw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.973135 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vwftl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c755e546-15de-4347-bc90-03a4ea362583\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5d7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vwftl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.982224 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.982315 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.982375 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 
00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.982437 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.982490 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:06Z","lastTransitionTime":"2026-03-10T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:06 crc kubenswrapper[4906]: I0310 00:08:06.993497 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8518e6b0-48c0-4076-a58a-e2ba8751c90f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1ce33b4800ac33b19f03506f197894abd2b965c34db7b506d7f06e8824b0229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1237a4e233c776796482d73c65c85e5d86e592d6a7f53a7b76fea4529fe9e435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e445f27670002f5ac446f9be56896acdbf9e3e30801b073b231cc8ce7ddac7c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75634c35b3e26fec38c8aa89f5a8055117f60566c8d46cf9ec28cad40c688be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab422d45102787731cdbfd6d0e400974761d5af01f0a43d04cc25228c1db30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:06:54.228491 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:06:54.228657 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:06:54.229423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1709734890/tls.crt::/tmp/serving-cert-1709734890/tls.key\\\\\\\"\\\\nI0310 00:06:54.538287 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:06:54.541181 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:06:54.541202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:06:54.541225 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:06:54.541230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 
00:06:54.549119 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 00:06:54.549178 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549189 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549199 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:06:54.549207 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:06:54.549213 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:06:54.549219 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 00:06:54.549128 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 00:06:54.551727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c49063cf40b4074c120bf78de05481a9e5638b11137a47d73d15b3e1a43f41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.008342 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://161718aee1b97751dccdff3689a0dc26eb0efb1842899be5bb4994598adb36db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.014276 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c755e546-15de-4347-bc90-03a4ea362583-serviceca\") pod \"node-ca-vwftl\" (UID: \"c755e546-15de-4347-bc90-03a4ea362583\") " pod="openshift-image-registry/node-ca-vwftl" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.014331 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gj5d7\" (UniqueName: \"kubernetes.io/projected/c755e546-15de-4347-bc90-03a4ea362583-kube-api-access-gj5d7\") pod \"node-ca-vwftl\" (UID: \"c755e546-15de-4347-bc90-03a4ea362583\") " pod="openshift-image-registry/node-ca-vwftl" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.014371 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c755e546-15de-4347-bc90-03a4ea362583-host\") pod 
\"node-ca-vwftl\" (UID: \"c755e546-15de-4347-bc90-03a4ea362583\") " pod="openshift-image-registry/node-ca-vwftl" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.014428 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c755e546-15de-4347-bc90-03a4ea362583-host\") pod \"node-ca-vwftl\" (UID: \"c755e546-15de-4347-bc90-03a4ea362583\") " pod="openshift-image-registry/node-ca-vwftl" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.015396 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c755e546-15de-4347-bc90-03a4ea362583-serviceca\") pod \"node-ca-vwftl\" (UID: \"c755e546-15de-4347-bc90-03a4ea362583\") " pod="openshift-image-registry/node-ca-vwftl" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.020008 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-85dv2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c494c18-0d46-4e23-8ef5-214938a66a7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c5df9fb921fc76518f8da175e0a9b4e05f248b292a22ee6bcb600e07d94077a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftt77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-85dv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.029073 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bd57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a677000-e7dd-40fa-ad3e-94c061293e29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dedc89ecfde3aab4fefadbd750541c4fe076777f8e0583ab87c3a33ad2d3b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hdfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bd57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.036037 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gj5d7\" (UniqueName: \"kubernetes.io/projected/c755e546-15de-4347-bc90-03a4ea362583-kube-api-access-gj5d7\") pod \"node-ca-vwftl\" (UID: \"c755e546-15de-4347-bc90-03a4ea362583\") " pod="openshift-image-registry/node-ca-vwftl" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.045798 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hskrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.059547 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8681864-d447-44e6-9cf5-6408d49ec58a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4aeb4d2ad9fadbae974054da89166076b81a944e08f5475f1a9382a48cdc623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.077601 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c37b204d34ae2c304ca12b8978e189ce99a6c9352c9f68e01d074a8fccc1d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.086319 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.086347 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.086355 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.086372 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.086387 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:07Z","lastTransitionTime":"2026-03-10T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.091283 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jgfpc" event={"ID":"7e902b2c-8bf9-49e8-9820-392f34dbfb10","Type":"ContainerStarted","Data":"245f709701748040cb586efcba214ea87e3834f74aec843887bf0a47996d71ff"} Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.097630 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" event={"ID":"c9f87520-6105-4b6f-ba5a-a232b5dc24c0","Type":"ContainerStarted","Data":"543040e1a4d8163c569412e628ca907103b064cdc4e80cc31c4c5dc00042fde8"} Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.098102 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.108291 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8681864-d447-44e6-9cf5-6408d49ec58a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4aeb4d2ad9fadbae974054da89166076b81a944e08f5475f1a9382a48cdc623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.126832 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c37b204d34ae2c304ca12b8978e189ce99a6c9352c9f68e01d074a8fccc1d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.141629 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bd57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a677000-e7dd-40fa-ad3e-94c061293e29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dedc89ecfde3aab4fefadbd750541c4fe076777f8e0583ab87c3a33ad2d3b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hdfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bd57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.164076 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hskrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.166696 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.168072 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-vwftl" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.191549 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.191606 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.191628 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.191697 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.191717 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:07Z","lastTransitionTime":"2026-03-10T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.194673 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1c5481b-1561-42af-9126-977132b7ced2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8b43acbd8d86bc6523a3304162f30fcf4068197c4a56018dc56e1c8ae0bcd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3effb7d619bec3fd1ef64a418efe1e3503cb2dd187ce2188911a90b4034d331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad3cc19888cbd84c556e52eda6a3be770768b7316982d6442717f6aee62191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd888b723dd7b84ded4f27c69d82e2b203fec96dbaab8198df43bf0806b34ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e5dc928d14830f187866775b2cd95c46cff2d7749025b36d16f6489662bafa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.206417 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.223938 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.236527 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.254141 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jgfpc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e902b2c-8bf9-49e8-9820-392f34dbfb10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245f709701748040cb586efcba214ea87e3834f74aec843887bf0a47996d71ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://461ade02a7f0df808708774e4e7ac0d3e416bbc56988253f3f7741b2403f411e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://461ade02a7f0df808708774e4e7ac0d3e416bbc56988253f3f7741b2403f411e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2137af85643edb4160cec0443c914a41c18cde0c48615d328ef82c9cb5976685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2137af85643edb4160cec0443c914a41c18cde0c48615d328ef82c9cb5976685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13423588b4cb100ba7f85054546fccc23f287f313cbd61c4b15ae75e9fd7135a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13423588b4cb100ba7f85054546fccc23f287f313cbd61c4b15ae75e9fd7135a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c384
93d85c4cafaa4905f7ce5ad87a912694bd749ea8f13483b3cdae026d357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c38493d85c4cafaa4905f7ce5ad87a912694bd749ea8f13483b3cdae026d357\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0d67319708302cc33fdc4484ac7ef77101029f60a4733f0f10f7ebb0bbbd86d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0d67319708302cc33fdc4484ac7ef77101029f60a4733f0f10f7ebb0bbbd86d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:04Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860be515a0723147ee78d8b9e1bc3d3fff9675739ced0812fc9207df8a481fec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://860be515a0723147ee78d8b9e1bc3d3fff9675739ced0812fc9207df8a481fec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jgfpc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.268876 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8518e6b0-48c0-4076-a58a-e2ba8751c90f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1ce33b4800ac33b19f03506f197894abd2b965c34db7b506d7f06e8824b0229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1237a4e233c776796482d73c65c85e5d86e592d6a7f53a7b76fea4529fe9e435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e445f27670002f5ac446f9be56896acdbf9e3e30801b073b231cc8ce7ddac7c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75634c35b3e26fec38c8aa89f5a8055117f60566c8d46cf9ec28cad40c688be\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab422d45102787731cdbfd6d0e400974761d5af01f0a43d04cc25228c1db30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:06:54.228491 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:06:54.228657 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:06:54.229423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1709734890/tls.crt::/tmp/serving-cert-1709734890/tls.key\\\\\\\"\\\\nI0310 00:06:54.538287 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:06:54.541181 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:06:54.541202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:06:54.541225 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:06:54.541230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:06:54.549119 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 00:06:54.549178 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549189 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549199 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:06:54.549207 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:06:54.549213 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:06:54.549219 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 00:06:54.549128 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 00:06:54.551727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c49063cf40b4074c120bf78de05481a9e5638b11137a47d73d15b3e1a43f41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.281575 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72569fcc50f473ea417eb7403f0901fc02d50a886ea39d62b2bb02682bc64dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ed04015b5fffd0db47db60c0aec762bd61d476c7cc3c0e726a5bd5de43baef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.294332 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72d61d35-0a64-45a5-8df3-9c429727deba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da01514bb52ced358e921f135d8e59cb221cf1c2c633fb9ee2b9e3d90ef0ffce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f53c021f85bbd1f77bb00169203a7ee8629e31f3
fab47b40dc594d7995d4b82a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bxtw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.295291 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.295326 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.295338 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:07 crc 
kubenswrapper[4906]: I0310 00:08:07.295358 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.295370 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:07Z","lastTransitionTime":"2026-03-10T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.305484 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vwftl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c755e546-15de-4347-bc90-03a4ea362583\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5d7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vwftl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.318147 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://161718aee1b97751dccdff3689a0dc26eb0efb1842899be5bb4994598adb36db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.337622 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-85dv2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c494c18-0d46-4e23-8ef5-214938a66a7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c5df9fb921fc76518f8da175e0a9b4e05f248b292a22ee6bcb600e07d94077a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftt77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-85dv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.348453 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://161718aee1b97751dccdff3689a0dc26eb0efb1842899be5bb4994598adb36db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]
}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.363258 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-85dv2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c494c18-0d46-4e23-8ef5-214938a66a7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c5df9fb921fc76518f8da175e0a9b4e05f248b292a22ee6bcb600e07d94077a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftt77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-1
0T00:08:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-85dv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.374531 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8681864-d447-44e6-9cf5-6408d49ec58a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4aeb4d2ad9fadbae974054da89166076b81a944e08f5475f1a9382a48cdc623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.390709 4906 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c37b204d34ae2c304ca12b8978e189ce99a6c9352c9f68e01d074a8fccc1d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.398769 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.398800 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.398809 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.398828 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.398838 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:07Z","lastTransitionTime":"2026-03-10T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.402913 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bd57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a677000-e7dd-40fa-ad3e-94c061293e29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dedc89ecfde3aab4fefadbd750541c4fe076777f8e0583ab87c3a33ad2d3b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hdfw\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bd57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.422343 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911516785da6f740d1b7e21cd68006ed12e6e826d87405543b150a5b6331deb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee1a9d86245fa50b0fdbc1fd897326de61c74df8a05667d2dfca5a397a105c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c09a8ef5244efb12dbf1782379b9633e2c35e936c2d1df7d715481c9b3bf0b21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0d58d338e79d263fe3e7d3ca54f65b5ff80376c0c094e3f51c66c00ad8ae878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58958cb3a6a43945e2e85351e464979217571ace2e8aff3602776ecb003df0be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f75ed8b74bcf143893e0244f5d561e39792f3c066b4ea296e5e75dd0999a43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://543040e1a4d8163c569412e628ca907103b064cdc4e80cc31c4c5dc00042fde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b0c40f63f43dcd6fb50a359080c2c076f62858b948841cdbaf8c354d198f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hskrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.442912 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1c5481b-1561-42af-9126-977132b7ced2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8b43acbd8d86bc6523a3304162f30fcf4068197c4a56018dc56e1c8ae0bcd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3effb7d619bec3fd1ef64a418efe1e3503cb2dd187ce2188911a90b4034d331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad3cc19888cbd84c556e52eda6a3be770768b7316982d6442717f6aee62191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd888b723dd7b84ded4f27c69d82e2b203fec96dbaab8198df43bf0806b34ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e5dc928d14830f187866775b2cd95c46cff2d7749025b36d16f6489662bafa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.12
6.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.453921 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.464096 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.478586 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.491672 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jgfpc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e902b2c-8bf9-49e8-9820-392f34dbfb10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245f709701748040cb586efcba214ea87e3834f74aec843887bf0a47996d71ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://461ade02a7f0df808708774e4e7ac0d3e416bbc56988253f3f7741b2403f411e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://461ade02a7f0df808708774e4e7ac0d3e416bbc56988253f3f7741b2403f411e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2137af85643edb4160cec0443c914a41c18cde0c48615d328ef82c9cb5976685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2137af85643edb4160cec0443c914a41c18cde0c48615d328ef82c9cb5976685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13423588b4cb100ba7f85054546fccc23f287f313cbd61c4b15ae75e9fd7135a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13423588b4cb100ba7f85054546fccc23f287f313cbd61c4b15ae75e9fd7135a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c384
93d85c4cafaa4905f7ce5ad87a912694bd749ea8f13483b3cdae026d357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c38493d85c4cafaa4905f7ce5ad87a912694bd749ea8f13483b3cdae026d357\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0d67319708302cc33fdc4484ac7ef77101029f60a4733f0f10f7ebb0bbbd86d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0d67319708302cc33fdc4484ac7ef77101029f60a4733f0f10f7ebb0bbbd86d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:04Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860be515a0723147ee78d8b9e1bc3d3fff9675739ced0812fc9207df8a481fec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://860be515a0723147ee78d8b9e1bc3d3fff9675739ced0812fc9207df8a481fec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jgfpc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.501344 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.501388 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.501398 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.501418 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.501431 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:07Z","lastTransitionTime":"2026-03-10T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.506176 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8518e6b0-48c0-4076-a58a-e2ba8751c90f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1ce33b4800ac33b19f03506f197894abd2b965c34db7b506d7f06e8824b0229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1237a4e233c776796482d73c65c85e5d86e592d6a7f53a7b76fea4529fe9e435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e445f27670002f5ac446f9be56896acdbf9e3e30801b073b231cc8ce7ddac7c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75634c35b3e26fec38c8aa89f5a8055117f60566c8d46cf9ec28cad40c688be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab422d45102787731cdbfd6d0e400974761d5af01f0a43d04cc25228c1db30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:06:54.228491 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:06:54.228657 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:06:54.229423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1709734890/tls.crt::/tmp/serving-cert-1709734890/tls.key\\\\\\\"\\\\nI0310 00:06:54.538287 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:06:54.541181 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:06:54.541202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:06:54.541225 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:06:54.541230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:06:54.549119 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 00:06:54.549178 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549189 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549199 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:06:54.549207 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:06:54.549213 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:06:54.549219 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 00:06:54.549128 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 00:06:54.551727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c49063cf40b4074c120bf78de05481a9e5638b11137a47d73d15b3e1a43f41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.519272 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72569fcc50f473ea417eb7403f0901fc02d50a886ea39d62b2bb02682bc64dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ed04015b5fffd0db47db60c0aec762bd61d476c7cc3c0e726a5bd5de43baef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.531054 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72d61d35-0a64-45a5-8df3-9c429727deba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da01514bb52ced358e921f135d8e59cb221cf1c2c633fb9ee2b9e3d90ef0ffce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f53c021f85bbd1f77bb00169203a7ee8629e31f3
fab47b40dc594d7995d4b82a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bxtw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.542585 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vwftl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c755e546-15de-4347-bc90-03a4ea362583\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5d7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vwftl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.604314 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.604364 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.604378 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 
00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.604401 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.604414 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:07Z","lastTransitionTime":"2026-03-10T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.706966 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.707319 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.707410 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.707514 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.707606 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:07Z","lastTransitionTime":"2026-03-10T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.811263 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.811555 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.811699 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.811803 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.811918 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:07Z","lastTransitionTime":"2026-03-10T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.914965 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.915000 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.915009 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.915025 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:07 crc kubenswrapper[4906]: I0310 00:08:07.915035 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:07Z","lastTransitionTime":"2026-03-10T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.018446 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.018817 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.018974 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.019113 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.019268 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:08Z","lastTransitionTime":"2026-03-10T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.103494 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vwftl" event={"ID":"c755e546-15de-4347-bc90-03a4ea362583","Type":"ContainerStarted","Data":"8ee4d840d8905aa3be1765cf7b77cb43f17139b794d7e9760b680ee47bb8d909"} Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.103588 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vwftl" event={"ID":"c755e546-15de-4347-bc90-03a4ea362583","Type":"ContainerStarted","Data":"f975a9c09d57a4d4ba82a5a11c8c3d75cb5a93ab4edd50efa124415d548b8138"} Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.104297 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.104336 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.122357 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.122397 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.122409 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.122429 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.122442 4906 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:08Z","lastTransitionTime":"2026-03-10T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.124234 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:08Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.146403 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:08Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.167351 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:08Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.168399 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.191222 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jgfpc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e902b2c-8bf9-49e8-9820-392f34dbfb10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245f709701748040cb586efcba214ea87e3834f74aec843887bf0a47996d71ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://461ade02a7f0df808708774e4e7ac0d3e416bbc56988253f3f7741b2403f411e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://461ade02a7f0df808708774e4e7ac0d3e416bbc56988253f3f7741b2403f411e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2137af85643edb4160cec0443c914a41c18cde0c48615d328ef82c9cb5976685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2137af85643edb4160cec0443c914a41c18cde0c48615d328ef82c9cb5976685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13423588b4cb100ba7f85054546fccc23f287f313cbd61c4b15ae75e9fd7135a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13423588b4cb100ba7f85054546fccc23f287f313cbd61c4b15ae75e9fd7135a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c384
93d85c4cafaa4905f7ce5ad87a912694bd749ea8f13483b3cdae026d357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c38493d85c4cafaa4905f7ce5ad87a912694bd749ea8f13483b3cdae026d357\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0d67319708302cc33fdc4484ac7ef77101029f60a4733f0f10f7ebb0bbbd86d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0d67319708302cc33fdc4484ac7ef77101029f60a4733f0f10f7ebb0bbbd86d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:04Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860be515a0723147ee78d8b9e1bc3d3fff9675739ced0812fc9207df8a481fec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://860be515a0723147ee78d8b9e1bc3d3fff9675739ced0812fc9207df8a481fec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jgfpc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:08Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.223789 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1c5481b-1561-42af-9126-977132b7ced2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8b43acbd8d86bc6523a3304162f30fcf4068197c4a56018dc56e1c8ae0bcd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3effb7d619bec3fd1ef64a418efe1e3503cb2dd187ce2188911a90b4034d331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad3cc19888cbd84c556e52eda6a3be770768b7316982d6442717f6aee62191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd888b723dd7b84d
ed4f27c69d82e2b203fec96dbaab8198df43bf0806b34ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e5dc928d14830f187866775b2cd95c46cff2d7749025b36d16f6489662bafa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:08Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.226137 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.226254 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.226335 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.226424 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.226455 4906 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:08Z","lastTransitionTime":"2026-03-10T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.237688 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72569fcc50f473ea417eb7403f0901fc02d50a886ea39d62b2bb02682bc64dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ed04015b5fffd0db47db60c0aec762bd61d476c7cc3c0e726a5bd5de43baef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:08Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.252305 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72d61d35-0a64-45a5-8df3-9c429727deba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da01514bb52ced358e921f135d8e59cb221cf1c2c633fb9ee2b9e3d90ef0ffce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f53c021f85bbd1f77bb00169203a7ee8629e31f3
fab47b40dc594d7995d4b82a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bxtw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:08Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.277692 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vwftl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c755e546-15de-4347-bc90-03a4ea362583\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ee4d840d8905aa3be1765cf7b77cb43f17139b794d7e9760b680ee47bb8d909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5d7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vwftl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:08Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.295761 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8518e6b0-48c0-4076-a58a-e2ba8751c90f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1ce33b4800ac33b19f03506f197894abd2b965c34db7b506d7f06e8824b0229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1237a4e233c776796482d73c65c85e5d86e592d6a7f53a7b76fea4529fe9e435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e445f27670002f5ac446f9be56896acdbf9e3e30801b073b231cc8ce7ddac7c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75634c35b3e26fec38c8aa89f5a8055117f60566c8d46cf9ec28cad40c688be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab422d45102787731cdbfd6d0e400974761d5af01f0a43d04cc25228c1db30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:06:54.228491 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:06:54.228657 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:06:54.229423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1709734890/tls.crt::/tmp/serving-cert-1709734890/tls.key\\\\\\\"\\\\nI0310 00:06:54.538287 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:06:54.541181 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:06:54.541202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:06:54.541225 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:06:54.541230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:06:54.549119 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 00:06:54.549178 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549189 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549199 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:06:54.549207 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:06:54.549213 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:06:54.549219 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 00:06:54.549128 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 00:06:54.551727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c49063cf40b4074c120bf78de05481a9e5638b11137a47d73d15b3e1a43f41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T0
0:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:08Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.315618 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://161718aee1b97751dccdff3689a0dc26eb0efb1842899be5bb4994598adb36db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:08:08Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.329807 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.329861 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.329879 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.329908 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.329924 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:08Z","lastTransitionTime":"2026-03-10T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.341376 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-85dv2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c494c18-0d46-4e23-8ef5-214938a66a7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c5df9fb921fc76518f8da175e0a9b4e05f248b292a22ee6bcb600e07d94077a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftt77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-85dv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:08Z 
is after 2025-08-24T17:21:41Z" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.357569 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bd57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a677000-e7dd-40fa-ad3e-94c061293e29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dedc89ecfde3aab4fefadbd750541c4fe076777f8e0583ab87c3a33ad2d3b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hdfw\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bd57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:08Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.380868 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911516785da6f740d1b7e21cd68006ed12e6e826d87405543b150a5b6331deb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee1a9d86245fa50b0fdbc1fd897326de61c74df8a05667d2dfca5a397a105c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c09a8ef5244efb12dbf1782379b9633e2c35e936c2d1df7d715481c9b3bf0b21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0d58d338e79d263fe3e7d3ca54f65b5ff80376c0c094e3f51c66c00ad8ae878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58958cb3a6a43945e2e85351e464979217571ace2e8aff3602776ecb003df0be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f75ed8b74bcf143893e0244f5d561e39792f3c066b4ea296e5e75dd0999a43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://543040e1a4d8163c569412e628ca907103b064cdc4e80cc31c4c5dc00042fde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b0c40f63f43dcd6fb50a359080c2c076f62858b948841cdbaf8c354d198f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hskrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:08Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.395922 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8681864-d447-44e6-9cf5-6408d49ec58a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4aeb4d2ad9fadbae974054da89166076b81a944e08f5475f1a9382a48cdc623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e1
8d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:08Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:08 crc 
kubenswrapper[4906]: I0310 00:08:08.414506 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c37b204d34ae2c304ca12b8978e189ce99a6c9352c9f68e01d074a8fccc1d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:08Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.427934 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:08Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.432288 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.432336 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.432346 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.432366 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.432381 4906 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:08Z","lastTransitionTime":"2026-03-10T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.446311 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jgfpc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e902b2c-8bf9-49e8-9820-392f34dbfb10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245f709701748040cb586efcba214ea87e3834f74aec843887bf0a47996d71ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://461ade02a7f0df808708774e4e7ac0d3e416bbc56988253f3f7741b2403f411e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://461ade02a7f0df808708774e4e7ac0d3e416bbc56988253f3f7741b2403f411e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2137af85643edb4160cec0443c914a41c18cde0c48615d328ef82c9cb5976685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2137af85643edb4160cec0443c914a41c18cde0c48615d328ef82c9cb5976685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13423588b4cb100ba7f85054546fccc23f287f313cbd61c4b15ae75e9fd7135a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13423588b4cb100ba7f85054546fccc23f287f313cbd61c4b15ae75e9fd7135a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:02Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c38493d85c4cafaa4905f7ce5ad87a912694bd749ea8f13483b3cdae026d357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c38493d85c4cafaa4905f7ce5ad87a912694bd749ea8f13483b3cdae026d357\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0d67319708302cc33fdc4484ac7ef77101029f60a4733f0f10f7ebb0bbbd86d\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0d67319708302cc33fdc4484ac7ef77101029f60a4733f0f10f7ebb0bbbd86d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860be515a0723147ee78d8b9e1bc3d3fff9675739ced0812fc9207df8a481fec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://860be515a0723147ee78d8b9e1bc3d3fff9675739ced0812fc9207df8a481fec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jgfpc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:08Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.467983 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1c5481b-1561-42af-9126-977132b7ced2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8b43acbd8d86bc6523a3304162f30fcf4068197c4a56018dc56e1c8ae0bcd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3effb7d619bec3fd1ef64a418efe1e3503cb2dd187ce2188911a90b4034d331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad3cc19888cbd84c556e52eda6a3be770768b7316982d6442717f6aee62191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd888b723dd7b84ded4f27c69d82e2b203fec96dbaab8198df43bf0806b34ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e5dc928d14830f187866775b2cd95c46cff2d7749025b36d16f6489662bafa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:08Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.483176 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:08Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.496621 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:08Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.506199 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vwftl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c755e546-15de-4347-bc90-03a4ea362583\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ee4d840d8905aa3be1765cf7b77cb43f17139b794d7e9760b680ee47bb8d909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5d7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vwftl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:08Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.516553 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8518e6b0-48c0-4076-a58a-e2ba8751c90f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1ce33b4800ac33b19f03506f197894abd2b965c34db7b506d7f06e8824b0229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1237a4e233c776796482d73c65c85e5d86e592d6a7f53a7b76fea4529fe9e435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e445f27670002f5ac446f9be56896acdbf9e3e30801b073b231cc8ce7ddac7c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75634c35b3e26fec38c8aa89f5a8055117f60566c8d46cf9ec28cad40c688be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab422d45102787731cdbfd6d0e400974761d5af01f0a43d04cc25228c1db30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:06:54.228491 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:06:54.228657 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:06:54.229423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1709734890/tls.crt::/tmp/serving-cert-1709734890/tls.key\\\\\\\"\\\\nI0310 00:06:54.538287 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:06:54.541181 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:06:54.541202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:06:54.541225 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:06:54.541230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:06:54.549119 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 00:06:54.549178 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549189 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549199 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:06:54.549207 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:06:54.549213 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:06:54.549219 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 00:06:54.549128 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 00:06:54.551727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c49063cf40b4074c120bf78de05481a9e5638b11137a47d73d15b3e1a43f41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T0
0:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:08Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.527183 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72569fcc50f473ea417eb7403f0901fc02d50a886ea39d62b2bb02682bc64dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ed04015b5fffd0db47db60c0aec762bd61d476c7cc3c0e726a5bd5de43baef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:08Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.535216 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.535265 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.535277 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.535299 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.535313 4906 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:08Z","lastTransitionTime":"2026-03-10T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.542786 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72d61d35-0a64-45a5-8df3-9c429727deba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da01514bb52ced358e921f135d8e59cb221cf1c2c633fb9ee2b9e3d90ef0ffce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f53c021f85bbd1f77bb00169203a7ee8629e31f3fab47b40dc594d7995d4b82a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bxtw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-10T00:08:08Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.557520 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://161718aee1b97751dccdff3689a0dc26eb0efb1842899be5bb4994598adb36db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:08Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.572442 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-85dv2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c494c18-0d46-4e23-8ef5-214938a66a7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c5df9fb921fc76518f8da175e0a9b4e05f248b292a22ee6bcb600e07d94077a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftt77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"
}}\" for pod \"openshift-multus\"/\"multus-85dv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:08Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.576256 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.576290 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:08 crc kubenswrapper[4906]: E0310 00:08:08.576393 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.576414 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:08 crc kubenswrapper[4906]: E0310 00:08:08.576474 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:08 crc kubenswrapper[4906]: E0310 00:08:08.576557 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.584167 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8681864-d447-44e6-9cf5-6408d49ec58a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4aeb4d2ad9fadbae974054da89166076b81a944e08f5475f1a9382a48cdc623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-10T00:08:08Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.597120 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c37b204d34ae2c304ca12b8978e189ce99a6c9352c9f68e01d074a8fccc1d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:08Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.607095 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bd57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a677000-e7dd-40fa-ad3e-94c061293e29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dedc89ecfde3aab4fefadbd750541c4fe076777f8e0583ab87c3a33ad2d3b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hdfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bd57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:08Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.625546 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911516785da6f740d1b7e21cd68006ed12e6e826d87405543b150a5b6331deb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee1a9d86245fa50b0fdbc1fd897326de61c74df8a05667d2dfca5a397a105c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c09a8ef5244efb12dbf1782379b9633e2c35e936c2d1df7d715481c9b3bf0b21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0d58d338e79d263fe3e7d3ca54f65b5ff80376c0c094e3f51c66c00ad8ae878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58958cb3a6a43945e2e85351e464979217571ace2e8aff3602776ecb003df0be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f75ed8b74bcf143893e0244f5d561e39792f3c066b4ea296e5e75dd0999a43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://543040e1a4d8163c569412e628ca907103b064cdc4e80cc31c4c5dc00042fde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b0c40f63f43dcd6fb50a359080c2c076f62858b948841cdbaf8c354d198f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hskrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:08Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.637814 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.637862 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.637875 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.637925 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.637938 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:08Z","lastTransitionTime":"2026-03-10T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.740549 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.740599 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.740610 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.740630 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.740668 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:08Z","lastTransitionTime":"2026-03-10T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.848227 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.848269 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.848281 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.848298 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.848308 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:08Z","lastTransitionTime":"2026-03-10T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.950293 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.950347 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.950362 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.950383 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:08 crc kubenswrapper[4906]: I0310 00:08:08.950396 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:08Z","lastTransitionTime":"2026-03-10T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:09 crc kubenswrapper[4906]: I0310 00:08:09.052780 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:09 crc kubenswrapper[4906]: I0310 00:08:09.052806 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:09 crc kubenswrapper[4906]: I0310 00:08:09.053132 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:09 crc kubenswrapper[4906]: I0310 00:08:09.053181 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:09 crc kubenswrapper[4906]: I0310 00:08:09.053195 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:09Z","lastTransitionTime":"2026-03-10T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:09 crc kubenswrapper[4906]: I0310 00:08:09.156499 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:09 crc kubenswrapper[4906]: I0310 00:08:09.156575 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:09 crc kubenswrapper[4906]: I0310 00:08:09.156594 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:09 crc kubenswrapper[4906]: I0310 00:08:09.156618 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:09 crc kubenswrapper[4906]: I0310 00:08:09.156663 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:09Z","lastTransitionTime":"2026-03-10T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:09 crc kubenswrapper[4906]: I0310 00:08:09.259252 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:09 crc kubenswrapper[4906]: I0310 00:08:09.259285 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:09 crc kubenswrapper[4906]: I0310 00:08:09.259293 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:09 crc kubenswrapper[4906]: I0310 00:08:09.259308 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:09 crc kubenswrapper[4906]: I0310 00:08:09.259317 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:09Z","lastTransitionTime":"2026-03-10T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:09 crc kubenswrapper[4906]: I0310 00:08:09.362430 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:09 crc kubenswrapper[4906]: I0310 00:08:09.362483 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:09 crc kubenswrapper[4906]: I0310 00:08:09.362496 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:09 crc kubenswrapper[4906]: I0310 00:08:09.362517 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:09 crc kubenswrapper[4906]: I0310 00:08:09.362531 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:09Z","lastTransitionTime":"2026-03-10T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:09 crc kubenswrapper[4906]: I0310 00:08:09.465672 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:09 crc kubenswrapper[4906]: I0310 00:08:09.465741 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:09 crc kubenswrapper[4906]: I0310 00:08:09.465761 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:09 crc kubenswrapper[4906]: I0310 00:08:09.465793 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:09 crc kubenswrapper[4906]: I0310 00:08:09.465814 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:09Z","lastTransitionTime":"2026-03-10T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:09 crc kubenswrapper[4906]: I0310 00:08:09.568779 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:09 crc kubenswrapper[4906]: I0310 00:08:09.568841 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:09 crc kubenswrapper[4906]: I0310 00:08:09.568863 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:09 crc kubenswrapper[4906]: I0310 00:08:09.568892 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:09 crc kubenswrapper[4906]: I0310 00:08:09.568913 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:09Z","lastTransitionTime":"2026-03-10T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:09 crc kubenswrapper[4906]: I0310 00:08:09.593400 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 10 00:08:09 crc kubenswrapper[4906]: I0310 00:08:09.671919 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:09 crc kubenswrapper[4906]: I0310 00:08:09.671979 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:09 crc kubenswrapper[4906]: I0310 00:08:09.671994 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:09 crc kubenswrapper[4906]: I0310 00:08:09.672015 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:09 crc kubenswrapper[4906]: I0310 00:08:09.672032 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:09Z","lastTransitionTime":"2026-03-10T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:09 crc kubenswrapper[4906]: I0310 00:08:09.774882 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:09 crc kubenswrapper[4906]: I0310 00:08:09.774935 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:09 crc kubenswrapper[4906]: I0310 00:08:09.774947 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:09 crc kubenswrapper[4906]: I0310 00:08:09.774968 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:09 crc kubenswrapper[4906]: I0310 00:08:09.774980 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:09Z","lastTransitionTime":"2026-03-10T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:09 crc kubenswrapper[4906]: I0310 00:08:09.878298 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:09 crc kubenswrapper[4906]: I0310 00:08:09.878350 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:09 crc kubenswrapper[4906]: I0310 00:08:09.878362 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:09 crc kubenswrapper[4906]: I0310 00:08:09.878386 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:09 crc kubenswrapper[4906]: I0310 00:08:09.878399 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:09Z","lastTransitionTime":"2026-03-10T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:09 crc kubenswrapper[4906]: I0310 00:08:09.981464 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:09 crc kubenswrapper[4906]: I0310 00:08:09.981535 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:09 crc kubenswrapper[4906]: I0310 00:08:09.981555 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:09 crc kubenswrapper[4906]: I0310 00:08:09.981593 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:09 crc kubenswrapper[4906]: I0310 00:08:09.981696 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:09Z","lastTransitionTime":"2026-03-10T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:10 crc kubenswrapper[4906]: I0310 00:08:10.084413 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:10 crc kubenswrapper[4906]: I0310 00:08:10.084458 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:10 crc kubenswrapper[4906]: I0310 00:08:10.084467 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:10 crc kubenswrapper[4906]: I0310 00:08:10.084484 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:10 crc kubenswrapper[4906]: I0310 00:08:10.084494 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:10Z","lastTransitionTime":"2026-03-10T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:10 crc kubenswrapper[4906]: I0310 00:08:10.112176 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hskrb_c9f87520-6105-4b6f-ba5a-a232b5dc24c0/ovnkube-controller/0.log" Mar 10 00:08:10 crc kubenswrapper[4906]: I0310 00:08:10.115283 4906 generic.go:334] "Generic (PLEG): container finished" podID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" containerID="543040e1a4d8163c569412e628ca907103b064cdc4e80cc31c4c5dc00042fde8" exitCode=1 Mar 10 00:08:10 crc kubenswrapper[4906]: I0310 00:08:10.115377 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" event={"ID":"c9f87520-6105-4b6f-ba5a-a232b5dc24c0","Type":"ContainerDied","Data":"543040e1a4d8163c569412e628ca907103b064cdc4e80cc31c4c5dc00042fde8"} Mar 10 00:08:10 crc kubenswrapper[4906]: I0310 00:08:10.116086 4906 scope.go:117] "RemoveContainer" containerID="543040e1a4d8163c569412e628ca907103b064cdc4e80cc31c4c5dc00042fde8" Mar 10 00:08:10 crc kubenswrapper[4906]: I0310 00:08:10.145019 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1c5481b-1561-42af-9126-977132b7ced2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8b43acbd8d86bc6523a3304162f30fcf4068197c4a56018dc56e1c8ae0bcd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3effb7d619bec3fd1ef64a418efe1e3503cb2dd187ce2188911a90b4034d331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad3cc19888cbd84c556e52eda6a3be770768b7316982d6442717f6aee62191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd888b723dd7b84ded4f27c69d82e2b203fec96dbaab8198df43bf0806b34ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e5dc928d14830f187866775b2cd95c46cff2d7749025b36d16f6489662bafa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:10Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:10 crc kubenswrapper[4906]: I0310 00:08:10.157811 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:10Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:10 crc kubenswrapper[4906]: I0310 00:08:10.173255 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:10Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:10 crc kubenswrapper[4906]: I0310 00:08:10.188026 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:10 crc kubenswrapper[4906]: I0310 00:08:10.188166 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:10 crc kubenswrapper[4906]: I0310 00:08:10.188193 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:10 crc kubenswrapper[4906]: I0310 00:08:10.188330 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:10 crc kubenswrapper[4906]: I0310 00:08:10.188365 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:10Z","lastTransitionTime":"2026-03-10T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:10 crc kubenswrapper[4906]: I0310 00:08:10.191326 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:10Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:10 crc kubenswrapper[4906]: I0310 00:08:10.206095 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jgfpc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e902b2c-8bf9-49e8-9820-392f34dbfb10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245f709701748040cb586efcba214ea87e3834f74aec843887bf0a47996d71ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://461ade02a7f0df808708774e4e7ac0d3e416bbc56988253f3f7741b2403f411e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://461ade02a7f0df808708774e4e7ac0d3e416bbc56988253f3f7741b2403f411e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2137af85643edb4160cec0443c914a41c18cde0c48615d328ef82c9cb5976685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2137af85643edb4160cec0443c914a41c18cde0c48615d328ef82c9cb5976685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13423588b4cb100ba7f85054546fccc23f287f313cbd61c4b15ae75e9fd7135a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13423588b4cb100ba7f85054546fccc23f287f313cbd61c4b15ae75e9fd7135a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c384
93d85c4cafaa4905f7ce5ad87a912694bd749ea8f13483b3cdae026d357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c38493d85c4cafaa4905f7ce5ad87a912694bd749ea8f13483b3cdae026d357\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0d67319708302cc33fdc4484ac7ef77101029f60a4733f0f10f7ebb0bbbd86d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0d67319708302cc33fdc4484ac7ef77101029f60a4733f0f10f7ebb0bbbd86d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:04Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860be515a0723147ee78d8b9e1bc3d3fff9675739ced0812fc9207df8a481fec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://860be515a0723147ee78d8b9e1bc3d3fff9675739ced0812fc9207df8a481fec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jgfpc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:10Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:10 crc kubenswrapper[4906]: I0310 00:08:10.221369 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fabe7c81-1aa3-4cec-a99f-4697dbe6921d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dab4356524a9cb048a66600ffecef81aa5f9f442b722d6780409970b0fe775d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b037feadaa2542ab3d4b67a5ced4e73ab0764d2c739c7ab3df075007346e16fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:44Z\\\",\\\"message\\\
":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0310 00:06:16.678293 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 00:06:16.681109 1 observer_polling.go:159] Starting file observer\\\\nI0310 00:06:16.724153 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 00:06:16.731026 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 00:06:44.062582 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 00:06:44.062735 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aefd20293d0c60b7686525f505af7f8d1ddd1ab1b3c63f95787470f6ef538df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acd01291605de2fa2a88cfaff323d3a654acbaff0550f4faa872560a04ac582b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3090d401f5153e72de6163581cf256a48f9e2aed39a2a1ab1911f0322fc2fd4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:10Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:10 crc kubenswrapper[4906]: I0310 00:08:10.234848 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72569fcc50f473ea417eb7403f0901fc02d50a886ea39d62b2bb02682bc64dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ed04015b5fffd0db47db60c0aec762bd61d476c7cc3c0e726a5bd5de43baef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:10Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:10 crc kubenswrapper[4906]: I0310 00:08:10.246877 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72d61d35-0a64-45a5-8df3-9c429727deba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da01514bb52ced358e921f135d8e59cb221cf1c2c633fb9ee2b9e3d90ef0ffce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f53c021f85bbd1f77bb00169203a7ee8629e31f3
fab47b40dc594d7995d4b82a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bxtw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:10Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:10 crc kubenswrapper[4906]: I0310 00:08:10.260092 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vwftl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c755e546-15de-4347-bc90-03a4ea362583\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ee4d840d8905aa3be1765cf7b77cb43f17139b794d7e9760b680ee47bb8d909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5d7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vwftl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:10Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:10 crc kubenswrapper[4906]: I0310 00:08:10.276827 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8518e6b0-48c0-4076-a58a-e2ba8751c90f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1ce33b4800ac33b19f03506f197894abd2b965c34db7b506d7f06e8824b0229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1237a4e233c776796482d73c65c85e5d86e592d6a7f53a7b76fea4529fe9e435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e445f27670002f5ac446f9be56896acdbf9e3e30801b073b231cc8ce7ddac7c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75634c35b3e26fec38c8aa89f5a8055117f60566c8d46cf9ec28cad40c688be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab422d45102787731cdbfd6d0e400974761d5af01f0a43d04cc25228c1db30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:06:54.228491 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:06:54.228657 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:06:54.229423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1709734890/tls.crt::/tmp/serving-cert-1709734890/tls.key\\\\\\\"\\\\nI0310 00:06:54.538287 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:06:54.541181 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:06:54.541202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:06:54.541225 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:06:54.541230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:06:54.549119 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 00:06:54.549178 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549189 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549199 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:06:54.549207 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:06:54.549213 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:06:54.549219 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 00:06:54.549128 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 00:06:54.551727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c49063cf40b4074c120bf78de05481a9e5638b11137a47d73d15b3e1a43f41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T0
0:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:10Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:10 crc kubenswrapper[4906]: I0310 00:08:10.291705 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:10 crc kubenswrapper[4906]: I0310 00:08:10.291737 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:10 crc kubenswrapper[4906]: I0310 00:08:10.291746 4906 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:10 crc kubenswrapper[4906]: I0310 00:08:10.291762 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:10 crc kubenswrapper[4906]: I0310 00:08:10.291770 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:10Z","lastTransitionTime":"2026-03-10T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:10 crc kubenswrapper[4906]: I0310 00:08:10.294910 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-85dv2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c494c18-0d46-4e23-8ef5-214938a66a7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c5df9fb921fc76518f8da175e0a9b4e05f248b292a
22ee6bcb600e07d94077a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftt77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-85dv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:10Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:10 crc kubenswrapper[4906]: I0310 00:08:10.310578 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://161718aee1b97751dccdff3689a0dc26eb0efb1842899be5bb4994598adb36db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416
f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:10Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:10 crc kubenswrapper[4906]: I0310 00:08:10.327071 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c37b204d34ae2c304ca12b8978e189ce99a6c9352c9f68e01d074a8fccc1d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:10Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:10 crc kubenswrapper[4906]: I0310 00:08:10.337471 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bd57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a677000-e7dd-40fa-ad3e-94c061293e29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dedc89ecfde3aab4fefadbd750541c4fe076777f8e0583ab87c3a33ad2d3b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hdfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bd57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:10Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:10 crc kubenswrapper[4906]: I0310 00:08:10.362605 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers 
with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911516785da6f740d1b7e21cd68006ed12e6e826d87405543b150a5b6331deb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee1a9d86245fa50b0fdbc1fd897326de61c74df8a05667d2dfca5a397a105c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ov
n-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c09a8ef5244efb12dbf1782379b9633e2c35e936c2d1df7d715481c9b3bf0b21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0d58d338e79d263fe3e7d3ca54f65b5ff80376c0c094e3f51c66c00ad8ae878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58958cb3a6a43945e2e85351e464979217571ace2e8aff3602776ecb003df0be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f75ed8b74bcf143893e0244f5d561e39792f3c066b4ea296e5e75dd0999a43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://543040e1a4d8163c569412e628ca907103b064cdc4e80cc31c4c5dc00042fde8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://543040e1a4d8163c569412e628ca907103b064cdc4e80cc31c4c5dc00042fde8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:09Z\\\",\\\"message\\\":\\\"00:08:09.409751 6733 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 00:08:09.409857 6733 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0310 00:08:09.409903 6733 
handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0310 00:08:09.409945 6733 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 00:08:09.409969 6733 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 00:08:09.411204 6733 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 00:08:09.411256 6733 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 00:08:09.411269 6733 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 00:08:09.411304 6733 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 00:08:09.411335 6733 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 00:08:09.411336 6733 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 00:08:09.411353 6733 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 00:08:09.411358 6733 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 00:08:09.411377 6733 factory.go:656] Stopping watch factory\\\\nI0310 00:08:09.411379 6733 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 00:08:09.411400 6733 ovnkube.go:599] Stopped ovnkube\\\\nI0310 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b0c40f63f43dcd6fb50a359080c2c076f62858b948841cdbaf8c354d198f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hskrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:10Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:10 crc kubenswrapper[4906]: I0310 00:08:10.373139 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8681864-d447-44e6-9cf5-6408d49ec58a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4aeb4d2ad9fadbae974054da89166076b81a944e08f5475f1a9382a48cdc623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:10Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:10 crc kubenswrapper[4906]: I0310 00:08:10.394269 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:10 crc kubenswrapper[4906]: I0310 00:08:10.394314 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:10 crc kubenswrapper[4906]: I0310 00:08:10.394324 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:10 crc kubenswrapper[4906]: I0310 00:08:10.394346 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:10 crc kubenswrapper[4906]: I0310 00:08:10.394357 4906 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:10Z","lastTransitionTime":"2026-03-10T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:10 crc kubenswrapper[4906]: I0310 00:08:10.496849 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:10 crc kubenswrapper[4906]: I0310 00:08:10.496907 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:10 crc kubenswrapper[4906]: I0310 00:08:10.496926 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:10 crc kubenswrapper[4906]: I0310 00:08:10.496953 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:10 crc kubenswrapper[4906]: I0310 00:08:10.496969 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:10Z","lastTransitionTime":"2026-03-10T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:10 crc kubenswrapper[4906]: I0310 00:08:10.576687 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:10 crc kubenswrapper[4906]: E0310 00:08:10.576871 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:10 crc kubenswrapper[4906]: I0310 00:08:10.577146 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:10 crc kubenswrapper[4906]: E0310 00:08:10.577214 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:10 crc kubenswrapper[4906]: I0310 00:08:10.577430 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:10 crc kubenswrapper[4906]: E0310 00:08:10.577508 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:10 crc kubenswrapper[4906]: I0310 00:08:10.599171 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:10 crc kubenswrapper[4906]: I0310 00:08:10.599209 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:10 crc kubenswrapper[4906]: I0310 00:08:10.599221 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:10 crc kubenswrapper[4906]: I0310 00:08:10.599239 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:10 crc kubenswrapper[4906]: I0310 00:08:10.599252 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:10Z","lastTransitionTime":"2026-03-10T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.526371 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.526414 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.526426 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.526448 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.526462 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:12Z","lastTransitionTime":"2026-03-10T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.545737 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hskrb_c9f87520-6105-4b6f-ba5a-a232b5dc24c0/ovnkube-controller/0.log" Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.551128 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.551208 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.551218 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" event={"ID":"c9f87520-6105-4b6f-ba5a-a232b5dc24c0","Type":"ContainerStarted","Data":"48ab4689eb420e0f3f45dfefc400831eba530d7711acdf241ef8455057bbf3b7"} Mar 10 00:08:12 crc kubenswrapper[4906]: E0310 00:08:12.551905 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.552006 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.552625 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:12 crc kubenswrapper[4906]: E0310 00:08:12.552854 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:12 crc kubenswrapper[4906]: E0310 00:08:12.553049 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.604695 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://161718aee1b97751dccdff3689a0dc26eb0efb1842899be5bb4994598adb36db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state
\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.629478 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.629517 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.629526 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.629543 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.629554 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:12Z","lastTransitionTime":"2026-03-10T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration 
file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.635357 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-85dv2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c494c18-0d46-4e23-8ef5-214938a66a7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c5df9fb921fc76518f8da175e0a9b4e05f248b292a22ee6bcb600e07d94077a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"moun
tPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftt77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-85dv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-10T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.652944 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8681864-d447-44e6-9cf5-6408d49ec58a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4aeb4d2ad9fadbae974054da89166076b81a944e08f5475f1a9382a48cdc623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.665550 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c37b204d34ae2c304ca12b8978e189ce99a6c9352c9f68e01d074a8fccc1d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.675146 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bd57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a677000-e7dd-40fa-ad3e-94c061293e29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dedc89ecfde3aab4fefadbd750541c4fe076777f8e0583ab87c3a33ad2d3b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hdfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bd57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.690577 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers 
with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911516785da6f740d1b7e21cd68006ed12e6e826d87405543b150a5b6331deb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee1a9d86245fa50b0fdbc1fd897326de61c74df8a05667d2dfca5a397a105c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ov
n-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c09a8ef5244efb12dbf1782379b9633e2c35e936c2d1df7d715481c9b3bf0b21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0d58d338e79d263fe3e7d3ca54f65b5ff80376c0c094e3f51c66c00ad8ae878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58958cb3a6a43945e2e85351e464979217571ace2e8aff3602776ecb003df0be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f75ed8b74bcf143893e0244f5d561e39792f3c066b4ea296e5e75dd0999a43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48ab4689eb420e0f3f45dfefc400831eba530d7711acdf241ef8455057bbf3b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://543040e1a4d8163c569412e628ca907103b064cdc4e80cc31c4c5dc00042fde8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:09Z\\\",\\\"message\\\":\\\"00:08:09.409751 6733 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 00:08:09.409857 6733 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0310 00:08:09.409903 6733 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0310 00:08:09.409945 6733 handler.go:208] Removed *v1.Pod event 
handler 6\\\\nI0310 00:08:09.409969 6733 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 00:08:09.411204 6733 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 00:08:09.411256 6733 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 00:08:09.411269 6733 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 00:08:09.411304 6733 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 00:08:09.411335 6733 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 00:08:09.411336 6733 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 00:08:09.411353 6733 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 00:08:09.411358 6733 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 00:08:09.411377 6733 factory.go:656] Stopping watch factory\\\\nI0310 00:08:09.411379 6733 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 00:08:09.411400 6733 ovnkube.go:599] Stopped ovnkube\\\\nI0310 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b0c40f63f43dcd6fb50a359080c2c076f62858b948841cdbaf8c354d198f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hskrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.714975 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1c5481b-1561-42af-9126-977132b7ced2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8b43acbd8d86bc6523a3304162f30fcf4068197c4a56018dc56e1c8ae0bcd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3effb7d619bec3fd1ef64a418efe1e3503cb2dd187ce2188911a90b4034d331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad3cc19888cbd84c556e52eda6a3be770768b7316982d6442717f6aee62191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd888b723dd7b84ded4f27c69d82e2b203fec96dbaab8198df43bf0806b34ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e5dc928d14830f187866775b2cd95c46cff2d7749025b36d16f6489662bafa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.731733 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.732320 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.732360 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.732373 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 
00:08:12.732392 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.732403 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:12Z","lastTransitionTime":"2026-03-10T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.744858 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.756652 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.770585 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jgfpc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e902b2c-8bf9-49e8-9820-392f34dbfb10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245f709701748040cb586efcba214ea87e3834f74aec843887bf0a47996d71ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://461ade02a7f0df808708774e4e7ac0d3e416bbc56988253f3f7741b2403f411e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://461ade02a7f0df808708774e4e7ac0d3e416bbc56988253f3f7741b2403f411e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2137af85643edb4160cec0443c914a41c18cde0c48615d328ef82c9cb5976685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2137af85643edb4160cec0443c914a41c18cde0c48615d328ef82c9cb5976685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13423588b4cb100ba7f85054546fccc23f287f313cbd61c4b15ae75e9fd7135a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13423588b4cb100ba7f85054546fccc23f287f313cbd61c4b15ae75e9fd7135a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c384
93d85c4cafaa4905f7ce5ad87a912694bd749ea8f13483b3cdae026d357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c38493d85c4cafaa4905f7ce5ad87a912694bd749ea8f13483b3cdae026d357\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0d67319708302cc33fdc4484ac7ef77101029f60a4733f0f10f7ebb0bbbd86d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0d67319708302cc33fdc4484ac7ef77101029f60a4733f0f10f7ebb0bbbd86d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:04Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860be515a0723147ee78d8b9e1bc3d3fff9675739ced0812fc9207df8a481fec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://860be515a0723147ee78d8b9e1bc3d3fff9675739ced0812fc9207df8a481fec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jgfpc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.784775 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8518e6b0-48c0-4076-a58a-e2ba8751c90f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1ce33b4800ac33b19f03506f197894abd2b965c34db7b506d7f06e8824b0229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1237a4e233c776796482d73c65c85e5d86e592d6a7f53a7b76fea4529fe9e435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e445f27670002f5ac446f9be56896acdbf9e3e30801b073b231cc8ce7ddac7c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75634c35b3e26fec38c8aa89f5a8055117f60566c8d46cf9ec28cad40c688be\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab422d45102787731cdbfd6d0e400974761d5af01f0a43d04cc25228c1db30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:06:54.228491 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:06:54.228657 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:06:54.229423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1709734890/tls.crt::/tmp/serving-cert-1709734890/tls.key\\\\\\\"\\\\nI0310 00:06:54.538287 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:06:54.541181 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:06:54.541202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:06:54.541225 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:06:54.541230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:06:54.549119 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 00:06:54.549178 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549189 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549199 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:06:54.549207 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:06:54.549213 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:06:54.549219 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 00:06:54.549128 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 00:06:54.551727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c49063cf40b4074c120bf78de05481a9e5638b11137a47d73d15b3e1a43f41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.796338 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fabe7c81-1aa3-4cec-a99f-4697dbe6921d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dab4356524a9cb048a66600ffecef81aa5f9f442b722d6780409970b0fe775d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b037feadaa2542ab3d4b67a5ced4e73ab0764d2c739c7ab3df075007346e16fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:44Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0310 00:06:16.678293 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 00:06:16.681109 1 observer_polling.go:159] Starting file observer\\\\nI0310 00:06:16.724153 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 00:06:16.731026 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 00:06:44.062582 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 00:06:44.062735 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aefd20293d0c60b7686525f505af7f8d1ddd1ab1b3c63f95787470f6ef538df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acd01291605de2fa2a88cfaff323d3a654acbaff0550f4faa872560a04ac582b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3090d401f5153e72de6163581cf256a48f9e2aed39a2a1ab1911f0322fc2fd4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.807003 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72569fcc50f473ea417eb7403f0901fc02d50a886ea39d62b2bb02682bc64dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ed04015b5fffd0db47db60c0aec762bd61d476c7cc3c0e726a5bd5de43baef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:12 crc 
kubenswrapper[4906]: I0310 00:08:12.819924 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72d61d35-0a64-45a5-8df3-9c429727deba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da01514bb52ced358e921f135d8e59cb221cf1c2c633fb9ee2b9e3d90ef0ffce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f53c021f85bbd1f77bb00169203a7ee8629e31f3fab47b40dc594d7995d4b82a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bxtw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.830450 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vwftl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c755e546-15de-4347-bc90-03a4ea362583\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ee4d840d8905aa3be1765cf7b77cb43f17139b794d7e9760b680ee47bb8d909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5d7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vwftl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.834933 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.835034 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.835093 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.835161 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.835296 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:12Z","lastTransitionTime":"2026-03-10T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.890465 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5zmkq"] Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.891748 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5zmkq" Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.893416 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.894065 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.904324 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vwftl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c755e546-15de-4347-bc90-03a4ea362583\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ee4d840d8905aa3be1765cf7b77cb43f17139b794d7e9760b680ee47bb8d909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5d7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vwftl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.917798 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8518e6b0-48c0-4076-a58a-e2ba8751c90f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1ce33b4800ac33b19f03506f197894abd2b965c34db7b506d7f06e8824b0229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1237a4e233c776796482d73c65c85e5d86e592d6a7f53a7b76fea4529fe9e435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e445f27670002f5ac446f9be56896acdbf9e3e30801b073b231cc8ce7ddac7c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75634c35b3e26fec38c8aa89f5a8055117f60566c8d46cf9ec28cad40c688be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab422d45102787731cdbfd6d0e400974761d5af01f0a43d04cc25228c1db30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:06:54.228491 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:06:54.228657 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:06:54.229423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1709734890/tls.crt::/tmp/serving-cert-1709734890/tls.key\\\\\\\"\\\\nI0310 00:06:54.538287 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:06:54.541181 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:06:54.541202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:06:54.541225 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:06:54.541230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:06:54.549119 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 00:06:54.549178 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549189 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549199 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:06:54.549207 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:06:54.549213 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:06:54.549219 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 00:06:54.549128 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 00:06:54.551727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c49063cf40b4074c120bf78de05481a9e5638b11137a47d73d15b3e1a43f41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T0
0:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.923870 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f18ec986-2e62-4a63-b0b1-8780eac64def-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-5zmkq\" (UID: \"f18ec986-2e62-4a63-b0b1-8780eac64def\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5zmkq" Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.923922 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f18ec986-2e62-4a63-b0b1-8780eac64def-env-overrides\") pod \"ovnkube-control-plane-749d76644c-5zmkq\" (UID: \"f18ec986-2e62-4a63-b0b1-8780eac64def\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5zmkq" Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.924016 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f18ec986-2e62-4a63-b0b1-8780eac64def-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-5zmkq\" (UID: \"f18ec986-2e62-4a63-b0b1-8780eac64def\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5zmkq" Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.924079 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwjjq\" (UniqueName: \"kubernetes.io/projected/f18ec986-2e62-4a63-b0b1-8780eac64def-kube-api-access-bwjjq\") pod \"ovnkube-control-plane-749d76644c-5zmkq\" (UID: \"f18ec986-2e62-4a63-b0b1-8780eac64def\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5zmkq" Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.929734 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fabe7c81-1aa3-4cec-a99f-4697dbe6921d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dab4356524a9cb048a66600ffecef81aa5f9f442b722d6780409970b0fe775d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b037feadaa2542ab3d4b67a5ced4e73ab0764d2c739c7ab3df075007346e16fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:44Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0310 00:06:16.678293 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 00:06:16.681109 1 observer_polling.go:159] Starting file observer\\\\nI0310 00:06:16.724153 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 00:06:16.731026 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 00:06:44.062582 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 00:06:44.062735 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aefd20293d0c60b7686525f505af7f8d1ddd1ab1b3c63f95787470f6ef538df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acd01291605de2fa2a88cfaff323d3a654acbaff0550f4faa872560a04ac582b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3090d401f5153e72de6163581cf256a48f9e2aed39a2a1ab1911f0322fc2fd4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.938411 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.938445 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.938453 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.938469 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.938477 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:12Z","lastTransitionTime":"2026-03-10T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.941575 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72569fcc50f473ea417eb7403f0901fc02d50a886ea39d62b2bb02682bc64dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ed04015b5fffd0db47db60c0aec762bd61d476c7cc3c0e726a5bd5de43baef\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.952281 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72d61d35-0a64-45a5-8df3-9c429727deba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da01514bb52ced358e921f135d8e59cb221cf1c2c633fb9ee2b9e3d90ef0ffce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f53c021f85bbd1f77bb00169203a7ee8629e31f3
fab47b40dc594d7995d4b82a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bxtw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.962957 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://161718aee1b97751dccdff3689a0dc26eb0efb1842899be5bb4994598adb36db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.974701 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-85dv2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c494c18-0d46-4e23-8ef5-214938a66a7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c5df9fb921fc76518f8da175e0a9b4e05f248b292a22ee6bcb600e07d94077a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftt77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-85dv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.985818 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5zmkq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f18ec986-2e62-4a63-b0b1-8780eac64def\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwjjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwjjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5zmkq\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:12 crc kubenswrapper[4906]: I0310 00:08:12.996842 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8681864-d447-44e6-9cf5-6408d49ec58a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4aeb4d2ad9fadbae974054da89166076b81a944e08f5475f1a9382a48cdc623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.008729 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c37b204d34ae2c304ca12b8978e189ce99a6c9352c9f68e01d074a8fccc1d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:13Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.017737 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bd57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a677000-e7dd-40fa-ad3e-94c061293e29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dedc89ecfde3aab4fefadbd750541c4fe076777f8e0583ab87c3a33ad2d3b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hdfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bd57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:13Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.024916 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f18ec986-2e62-4a63-b0b1-8780eac64def-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-5zmkq\" (UID: \"f18ec986-2e62-4a63-b0b1-8780eac64def\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5zmkq" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.025180 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f18ec986-2e62-4a63-b0b1-8780eac64def-env-overrides\") pod \"ovnkube-control-plane-749d76644c-5zmkq\" (UID: \"f18ec986-2e62-4a63-b0b1-8780eac64def\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5zmkq" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.025354 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f18ec986-2e62-4a63-b0b1-8780eac64def-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-5zmkq\" (UID: \"f18ec986-2e62-4a63-b0b1-8780eac64def\") 
" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5zmkq" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.025532 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwjjq\" (UniqueName: \"kubernetes.io/projected/f18ec986-2e62-4a63-b0b1-8780eac64def-kube-api-access-bwjjq\") pod \"ovnkube-control-plane-749d76644c-5zmkq\" (UID: \"f18ec986-2e62-4a63-b0b1-8780eac64def\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5zmkq" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.025981 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f18ec986-2e62-4a63-b0b1-8780eac64def-env-overrides\") pod \"ovnkube-control-plane-749d76644c-5zmkq\" (UID: \"f18ec986-2e62-4a63-b0b1-8780eac64def\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5zmkq" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.027157 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f18ec986-2e62-4a63-b0b1-8780eac64def-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-5zmkq\" (UID: \"f18ec986-2e62-4a63-b0b1-8780eac64def\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5zmkq" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.032391 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f18ec986-2e62-4a63-b0b1-8780eac64def-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-5zmkq\" (UID: \"f18ec986-2e62-4a63-b0b1-8780eac64def\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5zmkq" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.041417 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:13 crc kubenswrapper[4906]: 
I0310 00:08:13.041456 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.041469 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.041487 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.041499 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:13Z","lastTransitionTime":"2026-03-10T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.043477 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911516785da6f740d1b7e21cd68006ed12e6e826d87405543b150a5b6331deb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee1a9d86245fa50b0fdbc1fd897326de61c74df8a05667d2dfca5a397a105c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c09a8ef5244efb12dbf1782379b9633e2c35e936c2d1df7d715481c9b3bf0b21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0d58d338e79d263fe3e7d3ca54f65b5ff80376c0c094e3f51c66c00ad8ae878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58958cb3a6a43945e2e85351e464979217571ace2e8aff3602776ecb003df0be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f75ed8b74bcf143893e0244f5d561e39792f3c066b4ea296e5e75dd0999a43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48ab4689eb420e0f3f45dfefc400831eba530d7711acdf241ef8455057bbf3b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://543040e1a4d8163c569412e628ca907103b064cdc4e80cc31c4c5dc00042fde8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:09Z\\\",\\\"message\\\":\\\"00:08:09.409751 6733 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 
00:08:09.409857 6733 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0310 00:08:09.409903 6733 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0310 00:08:09.409945 6733 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 00:08:09.409969 6733 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 00:08:09.411204 6733 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 00:08:09.411256 6733 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 00:08:09.411269 6733 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 00:08:09.411304 6733 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 00:08:09.411335 6733 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 00:08:09.411336 6733 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 00:08:09.411353 6733 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 00:08:09.411358 6733 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 00:08:09.411377 6733 factory.go:656] Stopping watch factory\\\\nI0310 00:08:09.411379 6733 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 00:08:09.411400 6733 ovnkube.go:599] Stopped ovnkube\\\\nI0310 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b0c40f63f43dcd6fb50a359080c2c076f62858b948841cdbaf8c354d198f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hskrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:13Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.048789 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwjjq\" (UniqueName: \"kubernetes.io/projected/f18ec986-2e62-4a63-b0b1-8780eac64def-kube-api-access-bwjjq\") pod \"ovnkube-control-plane-749d76644c-5zmkq\" (UID: \"f18ec986-2e62-4a63-b0b1-8780eac64def\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5zmkq" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.056465 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:13Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.070533 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jgfpc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e902b2c-8bf9-49e8-9820-392f34dbfb10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245f709701748040cb586efcba214ea87e3834f74aec843887bf0a47996d71ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://461ade02a7f0df808708774e4e7ac0d3e416bbc56988253f3f7741b2403f411e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://461ade02a7f0df808708774e4e7ac0d3e416bbc56988253f3f7741b2403f411e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2137af85643edb4160cec0443c914a41c18cde0c48615d328ef82c9cb5976685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2137af85643edb4160cec0443c914a41c18cde0c48615d328ef82c9cb5976685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13423588b4cb100ba7f85054546fccc23f287f313cbd61c4b15ae75e9fd7135a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13423588b4cb100ba7f85054546fccc23f287f313cbd61c4b15ae75e9fd7135a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c384
93d85c4cafaa4905f7ce5ad87a912694bd749ea8f13483b3cdae026d357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c38493d85c4cafaa4905f7ce5ad87a912694bd749ea8f13483b3cdae026d357\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0d67319708302cc33fdc4484ac7ef77101029f60a4733f0f10f7ebb0bbbd86d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0d67319708302cc33fdc4484ac7ef77101029f60a4733f0f10f7ebb0bbbd86d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:04Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860be515a0723147ee78d8b9e1bc3d3fff9675739ced0812fc9207df8a481fec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://860be515a0723147ee78d8b9e1bc3d3fff9675739ced0812fc9207df8a481fec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jgfpc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:13Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.092982 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1c5481b-1561-42af-9126-977132b7ced2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8b43acbd8d86bc6523a3304162f30fcf4068197c4a56018dc56e1c8ae0bcd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3effb7d619bec3fd1ef64a418efe1e3503cb2dd187ce2188911a90b4034d331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad3cc19888cbd84c556e52eda6a3be770768b7316982d6442717f6aee62191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd888b723dd7b84d
ed4f27c69d82e2b203fec96dbaab8198df43bf0806b34ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e5dc928d14830f187866775b2cd95c46cff2d7749025b36d16f6489662bafa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:13Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.104735 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:13Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.116098 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:13Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.144235 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.144281 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.144290 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.144307 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.144318 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:13Z","lastTransitionTime":"2026-03-10T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.202541 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5zmkq" Mar 10 00:08:13 crc kubenswrapper[4906]: W0310 00:08:13.227902 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf18ec986_2e62_4a63_b0b1_8780eac64def.slice/crio-7324b3e95b22afcbbf9e0aa5cc8fb295774776bff7e6ee9bdda8b71d6cc283a7 WatchSource:0}: Error finding container 7324b3e95b22afcbbf9e0aa5cc8fb295774776bff7e6ee9bdda8b71d6cc283a7: Status 404 returned error can't find the container with id 7324b3e95b22afcbbf9e0aa5cc8fb295774776bff7e6ee9bdda8b71d6cc283a7 Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.247167 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.247196 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.247205 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.247221 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.247232 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:13Z","lastTransitionTime":"2026-03-10T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.349656 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.349723 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.349741 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.349770 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.349788 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:13Z","lastTransitionTime":"2026-03-10T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.452192 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.452227 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.452237 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.452255 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.452264 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:13Z","lastTransitionTime":"2026-03-10T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.554259 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.554296 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.554309 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.554361 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.554372 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:13Z","lastTransitionTime":"2026-03-10T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.557099 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hskrb_c9f87520-6105-4b6f-ba5a-a232b5dc24c0/ovnkube-controller/1.log" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.557803 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hskrb_c9f87520-6105-4b6f-ba5a-a232b5dc24c0/ovnkube-controller/0.log" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.560232 4906 generic.go:334] "Generic (PLEG): container finished" podID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" containerID="48ab4689eb420e0f3f45dfefc400831eba530d7711acdf241ef8455057bbf3b7" exitCode=1 Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.560312 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" event={"ID":"c9f87520-6105-4b6f-ba5a-a232b5dc24c0","Type":"ContainerDied","Data":"48ab4689eb420e0f3f45dfefc400831eba530d7711acdf241ef8455057bbf3b7"} Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.560356 4906 scope.go:117] "RemoveContainer" containerID="543040e1a4d8163c569412e628ca907103b064cdc4e80cc31c4c5dc00042fde8" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.561942 4906 scope.go:117] "RemoveContainer" containerID="48ab4689eb420e0f3f45dfefc400831eba530d7711acdf241ef8455057bbf3b7" Mar 10 00:08:13 crc kubenswrapper[4906]: E0310 00:08:13.562235 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-hskrb_openshift-ovn-kubernetes(c9f87520-6105-4b6f-ba5a-a232b5dc24c0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" podUID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.563728 4906 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5zmkq" event={"ID":"f18ec986-2e62-4a63-b0b1-8780eac64def","Type":"ContainerStarted","Data":"ebd1becf60e90db5a68bbcd8dc00cf672fd5b9f329aa84dd8a798060e8bb385e"} Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.563770 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5zmkq" event={"ID":"f18ec986-2e62-4a63-b0b1-8780eac64def","Type":"ContainerStarted","Data":"cc46d6cc757c0d334c48e6458543df7d62ae0209adab82b8e8f902b1c686a187"} Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.563786 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5zmkq" event={"ID":"f18ec986-2e62-4a63-b0b1-8780eac64def","Type":"ContainerStarted","Data":"7324b3e95b22afcbbf9e0aa5cc8fb295774776bff7e6ee9bdda8b71d6cc283a7"} Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.574237 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-85dv2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c494c18-0d46-4e23-8ef5-214938a66a7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c5df9fb921fc76518f8da175e0a9b4e05f248b292a22ee6bcb600e07d94077a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftt77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-85dv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:13Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.586907 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5zmkq" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f18ec986-2e62-4a63-b0b1-8780eac64def\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwjjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwjjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5zmkq\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:13Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.600299 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://161718aee1b97751dccdff3689a0dc26eb0efb1842899be5bb4994598adb36db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},
{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:13Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.608197 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-5bn7b"] Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.608669 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bn7b" Mar 10 00:08:13 crc kubenswrapper[4906]: E0310 00:08:13.608725 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5bn7b" podUID="55d944ce-605e-41a7-9211-a5bc388145f1" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.620497 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c37b204d34ae2c304ca12b8978e189ce99a6c9352c9f68e01d074a8fccc1d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:13Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.632760 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/55d944ce-605e-41a7-9211-a5bc388145f1-metrics-certs\") pod \"network-metrics-daemon-5bn7b\" (UID: \"55d944ce-605e-41a7-9211-a5bc388145f1\") " pod="openshift-multus/network-metrics-daemon-5bn7b" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.632809 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4pvg\" (UniqueName: \"kubernetes.io/projected/55d944ce-605e-41a7-9211-a5bc388145f1-kube-api-access-x4pvg\") pod \"network-metrics-daemon-5bn7b\" (UID: \"55d944ce-605e-41a7-9211-a5bc388145f1\") " pod="openshift-multus/network-metrics-daemon-5bn7b" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.634744 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bd57" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a677000-e7dd-40fa-ad3e-94c061293e29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dedc89ecfde3aab4fefadbd750541c4fe076777f8e0583ab87c3a33ad2d3b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hdfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bd57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:13Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.654098 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911516785da6f740d1b7e21cd68006ed12e6e826d87405543b150a5b6331deb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee1a9d86245fa50b0fdbc1fd897326de61c74df8a05667d2dfca5a397a105c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c09a8ef5244efb12dbf1782379b9633e2c35e936c2d1df7d715481c9b3bf0b21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0d58d338e79d263fe3e7d3ca54f65b5ff80376c0c094e3f51c66c00ad8ae878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58958cb3a6a43945e2e85351e464979217571ace2e8aff3602776ecb003df0be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f75ed8b74bcf143893e0244f5d561e39792f3c066b4ea296e5e75dd0999a43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48ab4689eb420e0f3f45dfefc400831eba530d7711acdf241ef8455057bbf3b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://543040e1a4d8163c569412e628ca907103b064cdc4e80cc31c4c5dc00042fde8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:09Z\\\",\\\"message\\\":\\\"00:08:09.409751 6733 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 00:08:09.409857 6733 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0310 00:08:09.409903 6733 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0310 00:08:09.409945 6733 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 
00:08:09.409969 6733 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 00:08:09.411204 6733 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 00:08:09.411256 6733 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 00:08:09.411269 6733 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 00:08:09.411304 6733 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 00:08:09.411335 6733 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 00:08:09.411336 6733 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 00:08:09.411353 6733 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 00:08:09.411358 6733 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 00:08:09.411377 6733 factory.go:656] Stopping watch factory\\\\nI0310 00:08:09.411379 6733 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 00:08:09.411400 6733 ovnkube.go:599] Stopped ovnkube\\\\nI0310 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48ab4689eb420e0f3f45dfefc400831eba530d7711acdf241ef8455057bbf3b7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:12Z\\\",\\\"message\\\":\\\"ormers/externalversions/factory.go:141\\\\nI0310 00:08:12.737120 6898 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 00:08:12.737152 6898 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 00:08:12.737610 6898 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 00:08:12.741981 6898 
handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 00:08:12.742032 6898 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 00:08:12.742078 6898 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 00:08:12.742186 6898 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 00:08:12.742390 6898 factory.go:656] Stopping watch factory\\\\nI0310 00:08:12.773613 6898 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0310 00:08:12.773663 6898 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0310 00:08:12.773729 6898 ovnkube.go:599] Stopped ovnkube\\\\nI0310 00:08:12.773753 6898 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 00:08:12.773838 6898 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"
mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b0c40f63f43dcd6fb50a359080c2c076f62858b948841cdbaf8c354d198f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hskrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:13Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.657009 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.657050 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.657059 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.657076 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.657086 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:13Z","lastTransitionTime":"2026-03-10T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.667705 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8681864-d447-44e6-9cf5-6408d49ec58a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4aeb4d2ad9fadbae974054da89166076b81a944e08f5475f1a9382a48cdc623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:13Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.692676 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1c5481b-1561-42af-9126-977132b7ced2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8b43acbd8d86bc6523a3304162f30fcf4068197c4a56018dc56e1c8ae0bcd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3effb7d619bec3fd1ef64a418efe1e3503cb2dd187ce2188911a90b4034d331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad3cc19888cbd84c556e52eda6a3be770768b7316982d6442717f6aee62191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd888b723dd7b84ded4f27c69d82e2b203fec96dbaab8198df43bf0806b34ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e5dc928d14830f187866775b2cd95c46cff2d7749025b36d16f6489662bafa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:13Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.706042 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:13Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.717498 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:13Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.728439 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:13Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.734138 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/55d944ce-605e-41a7-9211-a5bc388145f1-metrics-certs\") pod \"network-metrics-daemon-5bn7b\" (UID: \"55d944ce-605e-41a7-9211-a5bc388145f1\") " pod="openshift-multus/network-metrics-daemon-5bn7b" Mar 10 00:08:13 crc kubenswrapper[4906]: 
I0310 00:08:13.734177 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4pvg\" (UniqueName: \"kubernetes.io/projected/55d944ce-605e-41a7-9211-a5bc388145f1-kube-api-access-x4pvg\") pod \"network-metrics-daemon-5bn7b\" (UID: \"55d944ce-605e-41a7-9211-a5bc388145f1\") " pod="openshift-multus/network-metrics-daemon-5bn7b" Mar 10 00:08:13 crc kubenswrapper[4906]: E0310 00:08:13.734353 4906 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 00:08:13 crc kubenswrapper[4906]: E0310 00:08:13.734513 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55d944ce-605e-41a7-9211-a5bc388145f1-metrics-certs podName:55d944ce-605e-41a7-9211-a5bc388145f1 nodeName:}" failed. No retries permitted until 2026-03-10 00:08:14.234487915 +0000 UTC m=+120.382383087 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/55d944ce-605e-41a7-9211-a5bc388145f1-metrics-certs") pod "network-metrics-daemon-5bn7b" (UID: "55d944ce-605e-41a7-9211-a5bc388145f1") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.741439 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jgfpc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e902b2c-8bf9-49e8-9820-392f34dbfb10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245f709701748040cb586efcba214ea87e3834f74aec843887bf0a47996d71ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://461ade02a7f0df808708774e4e7ac0d3e416bbc56988253f3f7741b2403f411e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://461ade02a7f0df808708774e4e7ac0d3e416bbc56988253f3f7741b2403f411e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2137af85643edb4160cec0443c914a41c18cde0c48615d328ef82c9cb5976685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2137af85643edb4160cec0443c914a41c18cde0c48615d328ef82c9cb5976685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13423588b4cb100ba7f85054546fccc23f287f313cbd61c4b15ae75e9fd7135a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13423588b4cb100ba7f85054546fccc23f287f313cbd61c4b15ae75e9fd7135a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c384
93d85c4cafaa4905f7ce5ad87a912694bd749ea8f13483b3cdae026d357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c38493d85c4cafaa4905f7ce5ad87a912694bd749ea8f13483b3cdae026d357\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0d67319708302cc33fdc4484ac7ef77101029f60a4733f0f10f7ebb0bbbd86d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0d67319708302cc33fdc4484ac7ef77101029f60a4733f0f10f7ebb0bbbd86d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:04Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860be515a0723147ee78d8b9e1bc3d3fff9675739ced0812fc9207df8a481fec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://860be515a0723147ee78d8b9e1bc3d3fff9675739ced0812fc9207df8a481fec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jgfpc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:13Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.750145 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4pvg\" (UniqueName: \"kubernetes.io/projected/55d944ce-605e-41a7-9211-a5bc388145f1-kube-api-access-x4pvg\") pod \"network-metrics-daemon-5bn7b\" (UID: \"55d944ce-605e-41a7-9211-a5bc388145f1\") " pod="openshift-multus/network-metrics-daemon-5bn7b" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.754700 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fabe7c81-1aa3-4cec-a99f-4697dbe6921d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dab4356524a9cb048a66600ffecef81aa5f9f442b722d6780409970b0fe775d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491c
d10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b037feadaa2542ab3d4b67a5ced4e73ab0764d2c739c7ab3df075007346e16fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:44Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0310 00:06:16.678293 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0310 00:06:16.681109 1 observer_polling.go:159] Starting file observer\\\\nI0310 00:06:16.724153 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 00:06:16.731026 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 00:06:44.062582 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 00:06:44.062735 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aefd20293d0c60b7686525f505af7f8d1ddd1ab1b3c63f95787470f6ef538df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acd01291605de2fa2a88cfaff323d3a654acbaff0550f4faa872560a04ac582b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3090d401f5153e72de6163581cf256a48f9e2aed39a2a1ab1911f0322fc2fd4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\
"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:13Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.759205 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.759255 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.759291 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.759315 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.759327 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:13Z","lastTransitionTime":"2026-03-10T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.767362 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72569fcc50f473ea417eb7403f0901fc02d50a886ea39d62b2bb02682bc64dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ed04015b5fffd0db47db60c0aec762bd61d476c7cc3c0e726a5bd5de43baef\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:13Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.777578 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72d61d35-0a64-45a5-8df3-9c429727deba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da01514bb52ced358e921f135d8e59cb221cf1c2c633fb9ee2b9e3d90ef0ffce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f53c021f85bbd1f77bb00169203a7ee8629e31f3
fab47b40dc594d7995d4b82a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bxtw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:13Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.788367 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vwftl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c755e546-15de-4347-bc90-03a4ea362583\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ee4d840d8905aa3be1765cf7b77cb43f17139b794d7e9760b680ee47bb8d909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5d7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vwftl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:13Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.801497 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8518e6b0-48c0-4076-a58a-e2ba8751c90f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1ce33b4800ac33b19f03506f197894abd2b965c34db7b506d7f06e8824b0229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1237a4e233c776796482d73c65c85e5d86e592d6a7f53a7b76fea4529fe9e435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e445f27670002f5ac446f9be56896acdbf9e3e30801b073b231cc8ce7ddac7c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75634c35b3e26fec38c8aa89f5a8055117f60566c8d46cf9ec28cad40c688be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab422d45102787731cdbfd6d0e400974761d5af01f0a43d04cc25228c1db30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:06:54.228491 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:06:54.228657 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:06:54.229423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1709734890/tls.crt::/tmp/serving-cert-1709734890/tls.key\\\\\\\"\\\\nI0310 00:06:54.538287 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:06:54.541181 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:06:54.541202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:06:54.541225 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:06:54.541230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:06:54.549119 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 00:06:54.549178 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549189 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549199 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:06:54.549207 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:06:54.549213 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:06:54.549219 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 00:06:54.549128 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 00:06:54.551727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c49063cf40b4074c120bf78de05481a9e5638b11137a47d73d15b3e1a43f41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T0
0:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:13Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.818522 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911516785da6f740d1b7e21cd68006ed12e6e826d87405543b150a5b6331deb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee1a9d86245fa50b0fdbc1fd897326de61c74df8a05667d2dfca5a397a105c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c09a8ef5244efb12dbf1782379b9633e2c35e936c2d1df7d715481c9b3bf0b21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0d58d338e79d263fe3e7d3ca54f65b5ff80376c0c094e3f51c66c00ad8ae878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58958cb3a6a43945e2e85351e464979217571ace2e8aff3602776ecb003df0be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f75ed8b74bcf143893e0244f5d561e39792f3c066b4ea296e5e75dd0999a43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48ab4689eb420e0f3f45dfefc400831eba530d7711acdf241ef8455057bbf3b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://543040e1a4d8163c569412e628ca907103b064cdc4e80cc31c4c5dc00042fde8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:09Z\\\",\\\"message\\\":\\\"00:08:09.409751 6733 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 00:08:09.409857 6733 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0310 00:08:09.409903 6733 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0310 00:08:09.409945 6733 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 
00:08:09.409969 6733 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 00:08:09.411204 6733 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 00:08:09.411256 6733 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 00:08:09.411269 6733 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 00:08:09.411304 6733 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 00:08:09.411335 6733 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 00:08:09.411336 6733 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 00:08:09.411353 6733 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 00:08:09.411358 6733 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 00:08:09.411377 6733 factory.go:656] Stopping watch factory\\\\nI0310 00:08:09.411379 6733 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 00:08:09.411400 6733 ovnkube.go:599] Stopped ovnkube\\\\nI0310 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48ab4689eb420e0f3f45dfefc400831eba530d7711acdf241ef8455057bbf3b7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:12Z\\\",\\\"message\\\":\\\"ormers/externalversions/factory.go:141\\\\nI0310 00:08:12.737120 6898 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 00:08:12.737152 6898 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 00:08:12.737610 6898 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 00:08:12.741981 6898 
handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 00:08:12.742032 6898 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 00:08:12.742078 6898 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 00:08:12.742186 6898 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 00:08:12.742390 6898 factory.go:656] Stopping watch factory\\\\nI0310 00:08:12.773613 6898 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0310 00:08:12.773663 6898 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0310 00:08:12.773729 6898 ovnkube.go:599] Stopped ovnkube\\\\nI0310 00:08:12.773753 6898 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 00:08:12.773838 6898 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"
mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b0c40f63f43dcd6fb50a359080c2c076f62858b948841cdbaf8c354d198f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hskrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:13Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.827998 4906 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8681864-d447-44e6-9cf5-6408d49ec58a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4aeb4d2ad9fadbae974054da89166076b81a944e08f5475f1a9382a48cdc623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:13Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.839297 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c37b204d34ae2c304ca12b8978e189ce99a6c9352c9f68e01d074a8fccc1d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:13Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.848101 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bd57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a677000-e7dd-40fa-ad3e-94c061293e29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dedc89ecfde3aab4fefadbd750541c4fe076777f8e0583ab87c3a33ad2d3b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hdfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bd57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:13Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.857740 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:13Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.861693 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.861736 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.861750 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:13 crc 
kubenswrapper[4906]: I0310 00:08:13.861769 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.861781 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:13Z","lastTransitionTime":"2026-03-10T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.869796 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:13Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.881710 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jgfpc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e902b2c-8bf9-49e8-9820-392f34dbfb10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245f709701748040cb586efcba214ea87e3834f74aec843887bf0a47996d71ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://461ade02a7f0df808708774e4e7ac0d3e416bbc56988253f3f7741b2403f411e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://461ade02a7f0df808708774e4e7ac0d3e416bbc56988253f3f7741b2403f411e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2137af85643edb4160cec0443c914a41c18cde0c48615d328ef82c9cb5976685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2137af85643edb4160cec0443c914a41c18cde0c48615d328ef82c9cb5976685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13423588b4cb100ba7f85054546fccc23f287f313cbd61c4b15ae75e9fd7135a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13423588b4cb100ba7f85054546fccc23f287f313cbd61c4b15ae75e9fd7135a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c384
93d85c4cafaa4905f7ce5ad87a912694bd749ea8f13483b3cdae026d357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c38493d85c4cafaa4905f7ce5ad87a912694bd749ea8f13483b3cdae026d357\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0d67319708302cc33fdc4484ac7ef77101029f60a4733f0f10f7ebb0bbbd86d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0d67319708302cc33fdc4484ac7ef77101029f60a4733f0f10f7ebb0bbbd86d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:04Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860be515a0723147ee78d8b9e1bc3d3fff9675739ced0812fc9207df8a481fec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://860be515a0723147ee78d8b9e1bc3d3fff9675739ced0812fc9207df8a481fec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jgfpc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:13Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.898197 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1c5481b-1561-42af-9126-977132b7ced2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8b43acbd8d86bc6523a3304162f30fcf4068197c4a56018dc56e1c8ae0bcd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3effb7d619bec3fd1ef64a418efe1e3503cb2dd187ce2188911a90b4034d331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad3cc19888cbd84c556e52eda6a3be770768b7316982d6442717f6aee62191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd888b723dd7b84d
ed4f27c69d82e2b203fec96dbaab8198df43bf0806b34ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e5dc928d14830f187866775b2cd95c46cff2d7749025b36d16f6489662bafa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:13Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.908729 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:13Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.918092 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72d61d35-0a64-45a5-8df3-9c429727deba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da01514bb52ced358e921f135d8e59cb221cf1c2c633fb9ee2b9e3d90ef0ffce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f53c021f85bbd1f77bb00169203a7ee8629e31f3
fab47b40dc594d7995d4b82a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bxtw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:13Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.925780 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vwftl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c755e546-15de-4347-bc90-03a4ea362583\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ee4d840d8905aa3be1765cf7b77cb43f17139b794d7e9760b680ee47bb8d909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5d7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vwftl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:13Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.936004 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8518e6b0-48c0-4076-a58a-e2ba8751c90f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1ce33b4800ac33b19f03506f197894abd2b965c34db7b506d7f06e8824b0229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1237a4e233c776796482d73c65c85e5d86e592d6a7f53a7b76fea4529fe9e435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e445f27670002f5ac446f9be56896acdbf9e3e30801b073b231cc8ce7ddac7c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75634c35b3e26fec38c8aa89f5a8055117f60566c8d46cf9ec28cad40c688be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab422d45102787731cdbfd6d0e400974761d5af01f0a43d04cc25228c1db30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:06:54.228491 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:06:54.228657 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:06:54.229423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1709734890/tls.crt::/tmp/serving-cert-1709734890/tls.key\\\\\\\"\\\\nI0310 00:06:54.538287 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:06:54.541181 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:06:54.541202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:06:54.541225 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:06:54.541230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:06:54.549119 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 00:06:54.549178 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549189 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549199 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:06:54.549207 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:06:54.549213 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:06:54.549219 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 00:06:54.549128 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 00:06:54.551727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c49063cf40b4074c120bf78de05481a9e5638b11137a47d73d15b3e1a43f41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T0
0:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:13Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.945703 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fabe7c81-1aa3-4cec-a99f-4697dbe6921d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dab4356524a9cb048a66600ffecef81aa5f9f442b722d6780409970b0fe775d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b037feadaa2542ab3d4b67a5ced4e73ab0764d2c739c7ab3df075007346e16fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:44Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0310 00:06:16.678293 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 00:06:16.681109 1 observer_polling.go:159] Starting file observer\\\\nI0310 00:06:16.724153 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 00:06:16.731026 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 00:06:44.062582 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 00:06:44.062735 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aefd20293d0c60b7686525f505af7f8d1ddd1ab1b3c63f95787470f6ef538df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acd01291605de2fa2a88cfaff323d3a654acbaff0550f4faa872560a04ac582b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3090d401f5153e72de6163581cf256a48f9e2aed39a2a1ab1911f0322fc2fd4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:13Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.956238 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72569fcc50f473ea417eb7403f0901fc02d50a886ea39d62b2bb02682bc64dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ed04015b5fffd0db47db60c0aec762bd61d476c7cc3c0e726a5bd5de43baef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:13Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:13 crc 
kubenswrapper[4906]: I0310 00:08:13.964936 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.964987 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.965028 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.965056 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.965070 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:13Z","lastTransitionTime":"2026-03-10T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.965560 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5bn7b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55d944ce-605e-41a7-9211-a5bc388145f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4pvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4pvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5bn7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:13Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:13 crc 
kubenswrapper[4906]: I0310 00:08:13.977736 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://161718aee1b97751dccdff3689a0dc26eb0efb1842899be5bb4994598adb36db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:13Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.989553 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-85dv2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c494c18-0d46-4e23-8ef5-214938a66a7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c5df9fb921fc76518f8da175e0a9b4e05f248b292a22ee6bcb600e07d94077a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00
:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftt77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-85dv2\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:13Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:13 crc kubenswrapper[4906]: I0310 00:08:13.999258 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5zmkq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f18ec986-2e62-4a63-b0b1-8780eac64def\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc46d6cc757c0d334c48e6458543df7d62ae0209adab82b8e8f902b1c686a187\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2026-03-10T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwjjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebd1becf60e90db5a68bbcd8dc00cf672fd5b9f329aa84dd8a798060e8bb385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwjjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5zmkq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-10T00:08:13Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:14 crc kubenswrapper[4906]: I0310 00:08:14.067979 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:14 crc kubenswrapper[4906]: I0310 00:08:14.068023 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:14 crc kubenswrapper[4906]: I0310 00:08:14.068032 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:14 crc kubenswrapper[4906]: I0310 00:08:14.068050 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:14 crc kubenswrapper[4906]: I0310 00:08:14.068059 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:14Z","lastTransitionTime":"2026-03-10T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:14 crc kubenswrapper[4906]: I0310 00:08:14.171259 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:14 crc kubenswrapper[4906]: I0310 00:08:14.171304 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:14 crc kubenswrapper[4906]: I0310 00:08:14.171317 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:14 crc kubenswrapper[4906]: I0310 00:08:14.171335 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:14 crc kubenswrapper[4906]: I0310 00:08:14.171346 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:14Z","lastTransitionTime":"2026-03-10T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:14 crc kubenswrapper[4906]: I0310 00:08:14.239289 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/55d944ce-605e-41a7-9211-a5bc388145f1-metrics-certs\") pod \"network-metrics-daemon-5bn7b\" (UID: \"55d944ce-605e-41a7-9211-a5bc388145f1\") " pod="openshift-multus/network-metrics-daemon-5bn7b" Mar 10 00:08:14 crc kubenswrapper[4906]: E0310 00:08:14.239490 4906 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 00:08:14 crc kubenswrapper[4906]: E0310 00:08:14.239597 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55d944ce-605e-41a7-9211-a5bc388145f1-metrics-certs podName:55d944ce-605e-41a7-9211-a5bc388145f1 nodeName:}" failed. No retries permitted until 2026-03-10 00:08:15.239572268 +0000 UTC m=+121.387467390 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/55d944ce-605e-41a7-9211-a5bc388145f1-metrics-certs") pod "network-metrics-daemon-5bn7b" (UID: "55d944ce-605e-41a7-9211-a5bc388145f1") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 00:08:14 crc kubenswrapper[4906]: I0310 00:08:14.273598 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:14 crc kubenswrapper[4906]: I0310 00:08:14.273684 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:14 crc kubenswrapper[4906]: I0310 00:08:14.273704 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:14 crc kubenswrapper[4906]: I0310 00:08:14.273729 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:14 crc kubenswrapper[4906]: I0310 00:08:14.273741 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:14Z","lastTransitionTime":"2026-03-10T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:14 crc kubenswrapper[4906]: I0310 00:08:14.376718 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:14 crc kubenswrapper[4906]: I0310 00:08:14.376762 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:14 crc kubenswrapper[4906]: I0310 00:08:14.376777 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:14 crc kubenswrapper[4906]: I0310 00:08:14.376793 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:14 crc kubenswrapper[4906]: I0310 00:08:14.376804 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:14Z","lastTransitionTime":"2026-03-10T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:14 crc kubenswrapper[4906]: I0310 00:08:14.480209 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:14 crc kubenswrapper[4906]: I0310 00:08:14.480291 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:14 crc kubenswrapper[4906]: I0310 00:08:14.480316 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:14 crc kubenswrapper[4906]: I0310 00:08:14.480345 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:14 crc kubenswrapper[4906]: I0310 00:08:14.480362 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:14Z","lastTransitionTime":"2026-03-10T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:14 crc kubenswrapper[4906]: I0310 00:08:14.570426 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hskrb_c9f87520-6105-4b6f-ba5a-a232b5dc24c0/ovnkube-controller/1.log" Mar 10 00:08:14 crc kubenswrapper[4906]: I0310 00:08:14.575620 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:14 crc kubenswrapper[4906]: I0310 00:08:14.575706 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:14 crc kubenswrapper[4906]: I0310 00:08:14.575770 4906 scope.go:117] "RemoveContainer" containerID="48ab4689eb420e0f3f45dfefc400831eba530d7711acdf241ef8455057bbf3b7" Mar 10 00:08:14 crc kubenswrapper[4906]: E0310 00:08:14.575822 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:14 crc kubenswrapper[4906]: I0310 00:08:14.575890 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:14 crc kubenswrapper[4906]: E0310 00:08:14.575998 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-hskrb_openshift-ovn-kubernetes(c9f87520-6105-4b6f-ba5a-a232b5dc24c0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" podUID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" Mar 10 00:08:14 crc kubenswrapper[4906]: E0310 00:08:14.575932 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:14 crc kubenswrapper[4906]: E0310 00:08:14.576177 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:14 crc kubenswrapper[4906]: E0310 00:08:14.580797 4906 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 10 00:08:14 crc kubenswrapper[4906]: I0310 00:08:14.585203 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8681864-d447-44e6-9cf5-6408d49ec58a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4aeb4d2ad9fadbae974054da89166076b81a944e08f5475f1a9382a48cdc623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:14Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:14 crc kubenswrapper[4906]: I0310 00:08:14.605413 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c37b204d34ae2c304ca12b8978e189ce99a6c9352c9f68e01d074a8fccc1d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:14Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:14 crc kubenswrapper[4906]: I0310 00:08:14.620078 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bd57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a677000-e7dd-40fa-ad3e-94c061293e29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dedc89ecfde3aab4fefadbd750541c4fe076777f8e0583ab87c3a33ad2d3b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hdfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bd57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:14Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:14 crc kubenswrapper[4906]: I0310 00:08:14.653614 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911516785da6f740d1b7e21cd68006ed12e6e826d87405543b150a5b6331deb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee1a9d86245fa50b0fdbc1fd897326de61c74df8a05667d2dfca5a397a105c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c09a8ef5244efb12dbf1782379b9633e2c35e936c2d1df7d715481c9b3bf0b21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0d58d338e79d263fe3e7d3ca54f65b5ff80376c0c094e3f51c66c00ad8ae878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58958cb3a6a43945e2e85351e464979217571ace2e8aff3602776ecb003df0be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f75ed8b74bcf143893e0244f5d561e39792f3c066b4ea296e5e75dd0999a43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48ab4689eb420e0f3f45dfefc400831eba530d7711acdf241ef8455057bbf3b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48ab4689eb420e0f3f45dfefc400831eba530d7711acdf241ef8455057bbf3b7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:12Z\\\",\\\"message\\\":\\\"ormers/externalversions/factory.go:141\\\\nI0310 00:08:12.737120 6898 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 00:08:12.737152 6898 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 00:08:12.737610 6898 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 00:08:12.741981 6898 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 00:08:12.742032 6898 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 00:08:12.742078 6898 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 00:08:12.742186 6898 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 00:08:12.742390 6898 factory.go:656] Stopping watch factory\\\\nI0310 00:08:12.773613 6898 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0310 00:08:12.773663 6898 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0310 00:08:12.773729 6898 ovnkube.go:599] Stopped ovnkube\\\\nI0310 00:08:12.773753 6898 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 00:08:12.773838 6898 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hskrb_openshift-ovn-kubernetes(c9f87520-6105-4b6f-ba5a-a232b5dc24c0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b0c40f63f43dcd6fb50a359080c2c076f62858b948841cdbaf8c354d198f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b72675ce4cf6e8d6
5d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hskrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:14Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:14 crc kubenswrapper[4906]: I0310 00:08:14.672139 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jgfpc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e902b2c-8bf9-49e8-9820-392f34dbfb10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245f709701748040cb586efcba214ea87e3834f74aec843887bf0a47996d71ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://461ade02a7f0df808708774e4e7ac0d3e416bbc56988253f3f7741b2403f411e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://461ade02a7f0df808708774e4e7ac0d3e416bbc56988253f3f7741b2403f411e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2137af85643edb4160cec0443c914a41c18cde0c48615d328ef82c9cb5976685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2137af85643edb4160cec0443c914a41c18cde0c48615d328ef82c9cb5976685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13423588b4cb100ba7f85054546fccc23f287f313cbd61c4b15ae75e9fd7135a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13423588b4cb100ba7f85054546fccc23f287f313cbd61c4b15ae75e9fd7135a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c384
93d85c4cafaa4905f7ce5ad87a912694bd749ea8f13483b3cdae026d357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c38493d85c4cafaa4905f7ce5ad87a912694bd749ea8f13483b3cdae026d357\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0d67319708302cc33fdc4484ac7ef77101029f60a4733f0f10f7ebb0bbbd86d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0d67319708302cc33fdc4484ac7ef77101029f60a4733f0f10f7ebb0bbbd86d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:04Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860be515a0723147ee78d8b9e1bc3d3fff9675739ced0812fc9207df8a481fec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://860be515a0723147ee78d8b9e1bc3d3fff9675739ced0812fc9207df8a481fec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jgfpc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:14Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:14 crc kubenswrapper[4906]: E0310 00:08:14.687410 4906 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 00:08:14 crc kubenswrapper[4906]: I0310 00:08:14.692461 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1c5481b-1561-42af-9126-977132b7ced2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8b43acbd8d86bc6523a3304162f30fcf4068197c4a56018dc56e1c8ae0bcd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ec
d6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3effb7d619bec3fd1ef64a418efe1e3503cb2dd187ce2188911a90b4034d331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad3cc19888cbd84c556e52eda6a3be770768b7316982d6442717f6aee62191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\
\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd888b723dd7b84ded4f27c69d82e2b203fec96dbaab8198df43bf0806b34ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e5dc928d14830f187866775b2cd95c46cff2d7749025b36d16f6489662bafa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"host
IP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b
90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:14Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:14 crc kubenswrapper[4906]: I0310 00:08:14.707019 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:14Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:14 crc kubenswrapper[4906]: I0310 00:08:14.724798 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:14Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:14 crc kubenswrapper[4906]: I0310 00:08:14.738429 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:14Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:14 crc kubenswrapper[4906]: I0310 00:08:14.757926 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8518e6b0-48c0-4076-a58a-e2ba8751c90f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1ce33b4800ac33b19f03506f197894abd2b965c34db7b506d7f06e8824b0229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1237a4e233c776796482d73c65c85e5d86e592d6a7f53a7b76fea4529fe9e435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e445f27670002f5ac446f9be56896acdbf9e3e30801b073b231cc8ce7ddac7c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75634c35b3e26fec38c8aa89f5a8055117f60566c8d46cf9ec28cad40c688be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab422d45102787731cdbfd6d0e400974761d5af01f0a43d04cc25228c1db30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:54Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 00:06:54.228491 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:06:54.228657 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:06:54.229423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1709734890/tls.crt::/tmp/serving-cert-1709734890/tls.key\\\\\\\"\\\\nI0310 00:06:54.538287 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:06:54.541181 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:06:54.541202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:06:54.541225 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:06:54.541230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:06:54.549119 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 00:06:54.549178 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549189 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549199 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:06:54.549207 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:06:54.549213 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:06:54.549219 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 00:06:54.549128 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0310 00:06:54.551727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c49063cf40b4074c120bf78de05481a9e5638b11137a47d73d15b3e1a43f41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca17
0f5955a2014a2d1e693bfb82373f964\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:14Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:14 crc kubenswrapper[4906]: I0310 00:08:14.775148 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fabe7c81-1aa3-4cec-a99f-4697dbe6921d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dab4356524a9cb048a66600ffecef81aa5f9f442b722d6780409970b0fe775d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b037feadaa2542ab3d4b67a5ced4e73ab0764d2c739c7ab3df075007346e16fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:44Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0310 00:06:16.678293 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 00:06:16.681109 1 observer_polling.go:159] Starting file observer\\\\nI0310 00:06:16.724153 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 00:06:16.731026 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 00:06:44.062582 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 00:06:44.062735 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aefd20293d0c60b7686525f505af7f8d1ddd1ab1b3c63f95787470f6ef538df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acd01291605de2fa2a88cfaff323d3a654acbaff0550f4faa872560a04ac582b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3090d401f5153e72de6163581cf256a48f9e2aed39a2a1ab1911f0322fc2fd4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:14Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:14 crc kubenswrapper[4906]: I0310 00:08:14.789753 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72569fcc50f473ea417eb7403f0901fc02d50a886ea39d62b2bb02682bc64dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ed04015b5fffd0db47db60c0aec762bd61d476c7cc3c0e726a5bd5de43baef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:14Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:14 crc 
kubenswrapper[4906]: I0310 00:08:14.802847 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72d61d35-0a64-45a5-8df3-9c429727deba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da01514bb52ced358e921f135d8e59cb221cf1c2c633fb9ee2b9e3d90ef0ffce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f53c021f85bbd1f77bb00169203a7ee8629e31f3fab47b40dc594d7995d4b82a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bxtw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:14Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:14 crc kubenswrapper[4906]: I0310 00:08:14.814296 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vwftl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c755e546-15de-4347-bc90-03a4ea362583\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ee4d840d8905aa3be1765cf7b77cb43f17139b794d7e9760b680ee47bb8d909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5d7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vwftl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:14Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:14 crc kubenswrapper[4906]: I0310 00:08:14.825154 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://161718aee1b97751dccdff3689a0dc26eb0efb1842899be5bb4994598adb36db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T0
0:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:14Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:14 crc kubenswrapper[4906]: I0310 00:08:14.838703 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-85dv2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c494c18-0d46-4e23-8ef5-214938a66a7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c5df9fb921fc76518f8da175e0a9b4e05f248b292a22ee6bcb600e07d94077a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftt77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-85dv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:14Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:14 crc kubenswrapper[4906]: I0310 00:08:14.850865 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5zmkq" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f18ec986-2e62-4a63-b0b1-8780eac64def\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc46d6cc757c0d334c48e6458543df7d62ae0209adab82b8e8f902b1c686a187\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwjjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebd1becf6
0e90db5a68bbcd8dc00cf672fd5b9f329aa84dd8a798060e8bb385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwjjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5zmkq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:14Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:14 crc kubenswrapper[4906]: I0310 00:08:14.863386 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5bn7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55d944ce-605e-41a7-9211-a5bc388145f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4pvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4pvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5bn7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:14Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:14 crc 
kubenswrapper[4906]: I0310 00:08:14.877005 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5zmkq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f18ec986-2e62-4a63-b0b1-8780eac64def\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc46d6cc757c0d334c48e6458543df7d62ae0209adab82b8e8f902b1c686a187\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwjjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebd1becf60e90db5a68bbcd8dc00cf672fd5b9f329aa84dd8a798060e8bb385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwjjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5zmkq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:14Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:14 crc kubenswrapper[4906]: I0310 00:08:14.891096 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5bn7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55d944ce-605e-41a7-9211-a5bc388145f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4pvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4pvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5bn7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:14Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:14 crc 
kubenswrapper[4906]: I0310 00:08:14.905456 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://161718aee1b97751dccdff3689a0dc26eb0efb1842899be5bb4994598adb36db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:14Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:14 crc kubenswrapper[4906]: I0310 00:08:14.923010 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-85dv2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c494c18-0d46-4e23-8ef5-214938a66a7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c5df9fb921fc76518f8da175e0a9b4e05f248b292a22ee6bcb600e07d94077a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00
:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftt77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-85dv2\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:14Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:14 crc kubenswrapper[4906]: I0310 00:08:14.937891 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bd57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a677000-e7dd-40fa-ad3e-94c061293e29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dedc89ecfde3aab4fefadbd750541c4fe076777f8e0583ab87c3a33ad2d3b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hdfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bd57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:14Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:14 crc kubenswrapper[4906]: I0310 00:08:14.965445 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911516785da6f740d1b7e21cd68006ed12e6e826d87405543b150a5b6331deb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee1a9d86245fa50b0fdbc1fd897326de61c74df8a05667d2dfca5a397a105c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c09a8ef5244efb12dbf1782379b9633e2c35e936c2d1df7d715481c9b3bf0b21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0d58d338e79d263fe3e7d3ca54f65b5ff80376c0c094e3f51c66c00ad8ae878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58958cb3a6a43945e2e85351e464979217571ace2e8aff3602776ecb003df0be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f75ed8b74bcf143893e0244f5d561e39792f3c066b4ea296e5e75dd0999a43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48ab4689eb420e0f3f45dfefc400831eba530d7711acdf241ef8455057bbf3b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48ab4689eb420e0f3f45dfefc400831eba530d7711acdf241ef8455057bbf3b7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:12Z\\\",\\\"message\\\":\\\"ormers/externalversions/factory.go:141\\\\nI0310 00:08:12.737120 6898 reflector.go:311] 
Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 00:08:12.737152 6898 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 00:08:12.737610 6898 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 00:08:12.741981 6898 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 00:08:12.742032 6898 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 00:08:12.742078 6898 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 00:08:12.742186 6898 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 00:08:12.742390 6898 factory.go:656] Stopping watch factory\\\\nI0310 00:08:12.773613 6898 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0310 00:08:12.773663 6898 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0310 00:08:12.773729 6898 ovnkube.go:599] Stopped ovnkube\\\\nI0310 00:08:12.773753 6898 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 00:08:12.773838 6898 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hskrb_openshift-ovn-kubernetes(c9f87520-6105-4b6f-ba5a-a232b5dc24c0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b0c40f63f43dcd6fb50a359080c2c076f62858b948841cdbaf8c354d198f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b72675ce4cf6e8d6
5d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hskrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:14Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:14 crc kubenswrapper[4906]: I0310 00:08:14.979911 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8681864-d447-44e6-9cf5-6408d49ec58a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4aeb4d2ad9fadbae974054da89166076b81a944e08f5475f1a9382a48cdc623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:14Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:14 crc kubenswrapper[4906]: I0310 00:08:14.995252 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c37b204d34ae2c304ca12b8978e189ce99a6c9352c9f68e01d074a8fccc1d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:14Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:15 crc kubenswrapper[4906]: I0310 00:08:15.010045 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:15Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:15 crc kubenswrapper[4906]: I0310 00:08:15.022913 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:15Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:15 crc kubenswrapper[4906]: I0310 00:08:15.037536 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:15Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:15 crc kubenswrapper[4906]: I0310 00:08:15.053255 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jgfpc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e902b2c-8bf9-49e8-9820-392f34dbfb10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245f709701748040cb586efcba214ea87e3834f74aec843887bf0a47996d71ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://461ade02a7f0df808708774e4e7ac0d3e416bbc56988253f3f7741b2403f411e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://461ade02a7f0df808708774e4e7ac0d3e416bbc56988253f3f7741b2403f411e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2137af85643edb4160cec0443c914a41c18cde0c48615d328ef82c9cb5976685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2137af85643edb4160cec0443c914a41c18cde0c48615d328ef82c9cb5976685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13423588b4cb100ba7f85054546fccc23f287f313cbd61c4b15ae75e9fd7135a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13423588b4cb100ba7f85054546fccc23f287f313cbd61c4b15ae75e9fd7135a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c384
93d85c4cafaa4905f7ce5ad87a912694bd749ea8f13483b3cdae026d357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c38493d85c4cafaa4905f7ce5ad87a912694bd749ea8f13483b3cdae026d357\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0d67319708302cc33fdc4484ac7ef77101029f60a4733f0f10f7ebb0bbbd86d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0d67319708302cc33fdc4484ac7ef77101029f60a4733f0f10f7ebb0bbbd86d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:04Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860be515a0723147ee78d8b9e1bc3d3fff9675739ced0812fc9207df8a481fec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://860be515a0723147ee78d8b9e1bc3d3fff9675739ced0812fc9207df8a481fec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jgfpc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:15Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:15 crc kubenswrapper[4906]: I0310 00:08:15.077561 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1c5481b-1561-42af-9126-977132b7ced2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8b43acbd8d86bc6523a3304162f30fcf4068197c4a56018dc56e1c8ae0bcd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3effb7d619bec3fd1ef64a418efe1e3503cb2dd187ce2188911a90b4034d331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad3cc19888cbd84c556e52eda6a3be770768b7316982d6442717f6aee62191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd888b723dd7b84d
ed4f27c69d82e2b203fec96dbaab8198df43bf0806b34ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e5dc928d14830f187866775b2cd95c46cff2d7749025b36d16f6489662bafa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:15Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:15 crc kubenswrapper[4906]: I0310 00:08:15.096474 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72569fcc50f473ea417eb7403f0901fc02d50a886ea39d62b2bb02682bc64dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ed04015b5fffd0db47db60c0aec762bd61d476c7cc3c0e726a5bd5de43baef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:15Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:15 crc kubenswrapper[4906]: I0310 00:08:15.110049 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72d61d35-0a64-45a5-8df3-9c429727deba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da01514bb52ced358e921f135d8e59cb221cf1c2c633fb9ee2b9e3d90ef0ffce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f53c021f85bbd1f77bb00169203a7ee8629e31f3
fab47b40dc594d7995d4b82a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bxtw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:15Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:15 crc kubenswrapper[4906]: I0310 00:08:15.121874 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vwftl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c755e546-15de-4347-bc90-03a4ea362583\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ee4d840d8905aa3be1765cf7b77cb43f17139b794d7e9760b680ee47bb8d909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5d7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vwftl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:15Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:15 crc kubenswrapper[4906]: I0310 00:08:15.138892 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8518e6b0-48c0-4076-a58a-e2ba8751c90f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1ce33b4800ac33b19f03506f197894abd2b965c34db7b506d7f06e8824b0229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1237a4e233c776796482d73c65c85e5d86e592d6a7f53a7b76fea4529fe9e435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e445f27670002f5ac446f9be56896acdbf9e3e30801b073b231cc8ce7ddac7c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75634c35b3e26fec38c8aa89f5a8055117f60566c8d46cf9ec28cad40c688be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab422d45102787731cdbfd6d0e400974761d5af01f0a43d04cc25228c1db30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:06:54.228491 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:06:54.228657 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:06:54.229423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1709734890/tls.crt::/tmp/serving-cert-1709734890/tls.key\\\\\\\"\\\\nI0310 00:06:54.538287 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:06:54.541181 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:06:54.541202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:06:54.541225 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:06:54.541230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:06:54.549119 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 00:06:54.549178 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549189 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549199 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:06:54.549207 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:06:54.549213 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:06:54.549219 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 00:06:54.549128 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 00:06:54.551727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c49063cf40b4074c120bf78de05481a9e5638b11137a47d73d15b3e1a43f41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T0
0:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:15Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:15 crc kubenswrapper[4906]: I0310 00:08:15.153993 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fabe7c81-1aa3-4cec-a99f-4697dbe6921d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dab4356524a9cb048a66600ffecef81aa5f9f442b722d6780409970b0fe775d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b037feadaa2542ab3d4b67a5ced4e73ab0764d2c739c7ab3df075007346e16fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:44Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0310 00:06:16.678293 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 00:06:16.681109 1 observer_polling.go:159] Starting file observer\\\\nI0310 00:06:16.724153 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 00:06:16.731026 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 00:06:44.062582 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 00:06:44.062735 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aefd20293d0c60b7686525f505af7f8d1ddd1ab1b3c63f95787470f6ef538df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acd01291605de2fa2a88cfaff323d3a654acbaff0550f4faa872560a04ac582b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3090d401f5153e72de6163581cf256a48f9e2aed39a2a1ab1911f0322fc2fd4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:15Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:15 crc kubenswrapper[4906]: I0310 00:08:15.250943 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/55d944ce-605e-41a7-9211-a5bc388145f1-metrics-certs\") pod \"network-metrics-daemon-5bn7b\" (UID: \"55d944ce-605e-41a7-9211-a5bc388145f1\") " pod="openshift-multus/network-metrics-daemon-5bn7b" Mar 10 00:08:15 crc kubenswrapper[4906]: E0310 00:08:15.251134 4906 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 00:08:15 crc kubenswrapper[4906]: E0310 00:08:15.251216 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55d944ce-605e-41a7-9211-a5bc388145f1-metrics-certs podName:55d944ce-605e-41a7-9211-a5bc388145f1 nodeName:}" failed. No retries permitted until 2026-03-10 00:08:17.251196684 +0000 UTC m=+123.399091806 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/55d944ce-605e-41a7-9211-a5bc388145f1-metrics-certs") pod "network-metrics-daemon-5bn7b" (UID: "55d944ce-605e-41a7-9211-a5bc388145f1") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 00:08:15 crc kubenswrapper[4906]: I0310 00:08:15.575698 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bn7b" Mar 10 00:08:15 crc kubenswrapper[4906]: E0310 00:08:15.575945 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bn7b" podUID="55d944ce-605e-41a7-9211-a5bc388145f1" Mar 10 00:08:15 crc kubenswrapper[4906]: I0310 00:08:15.918971 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:15 crc kubenswrapper[4906]: I0310 00:08:15.919013 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:15 crc kubenswrapper[4906]: I0310 00:08:15.919024 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:15 crc kubenswrapper[4906]: I0310 00:08:15.919041 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:15 crc kubenswrapper[4906]: I0310 00:08:15.919052 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:15Z","lastTransitionTime":"2026-03-10T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:15 crc kubenswrapper[4906]: E0310 00:08:15.942430 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a60179e-98a5-4d7f-9dd0-5aef84f37492\\\",\\\"systemUUID\\\":\\\"5a964b87-c4f2-4bba-95e1-b2c12e6316ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:15Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:15 crc kubenswrapper[4906]: I0310 00:08:15.948247 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:15 crc kubenswrapper[4906]: I0310 00:08:15.948320 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:15 crc kubenswrapper[4906]: I0310 00:08:15.948337 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:15 crc kubenswrapper[4906]: I0310 00:08:15.948367 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:15 crc kubenswrapper[4906]: I0310 00:08:15.948385 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:15Z","lastTransitionTime":"2026-03-10T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:15 crc kubenswrapper[4906]: E0310 00:08:15.972128 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a60179e-98a5-4d7f-9dd0-5aef84f37492\\\",\\\"systemUUID\\\":\\\"5a964b87-c4f2-4bba-95e1-b2c12e6316ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:15Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:15 crc kubenswrapper[4906]: I0310 00:08:15.977003 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:15 crc kubenswrapper[4906]: I0310 00:08:15.977070 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:15 crc kubenswrapper[4906]: I0310 00:08:15.977089 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:15 crc kubenswrapper[4906]: I0310 00:08:15.977111 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:15 crc kubenswrapper[4906]: I0310 00:08:15.977126 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:15Z","lastTransitionTime":"2026-03-10T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:15 crc kubenswrapper[4906]: E0310 00:08:15.998070 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a60179e-98a5-4d7f-9dd0-5aef84f37492\\\",\\\"systemUUID\\\":\\\"5a964b87-c4f2-4bba-95e1-b2c12e6316ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:15Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4906]: I0310 00:08:16.004054 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:16 crc kubenswrapper[4906]: I0310 00:08:16.004146 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:16 crc kubenswrapper[4906]: I0310 00:08:16.004163 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:16 crc kubenswrapper[4906]: I0310 00:08:16.004399 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:16 crc kubenswrapper[4906]: I0310 00:08:16.004417 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:16Z","lastTransitionTime":"2026-03-10T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:16 crc kubenswrapper[4906]: E0310 00:08:16.024005 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a60179e-98a5-4d7f-9dd0-5aef84f37492\\\",\\\"systemUUID\\\":\\\"5a964b87-c4f2-4bba-95e1-b2c12e6316ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4906]: I0310 00:08:16.029662 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:16 crc kubenswrapper[4906]: I0310 00:08:16.029713 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:16 crc kubenswrapper[4906]: I0310 00:08:16.029725 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:16 crc kubenswrapper[4906]: I0310 00:08:16.029747 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:16 crc kubenswrapper[4906]: I0310 00:08:16.029765 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:16Z","lastTransitionTime":"2026-03-10T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:16 crc kubenswrapper[4906]: E0310 00:08:16.044499 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a60179e-98a5-4d7f-9dd0-5aef84f37492\\\",\\\"systemUUID\\\":\\\"5a964b87-c4f2-4bba-95e1-b2c12e6316ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:16 crc kubenswrapper[4906]: E0310 00:08:16.044811 4906 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 00:08:16 crc kubenswrapper[4906]: I0310 00:08:16.575804 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:16 crc kubenswrapper[4906]: I0310 00:08:16.575832 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:16 crc kubenswrapper[4906]: I0310 00:08:16.576142 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:16 crc kubenswrapper[4906]: E0310 00:08:16.576285 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:16 crc kubenswrapper[4906]: E0310 00:08:16.576443 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:16 crc kubenswrapper[4906]: E0310 00:08:16.576506 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:17 crc kubenswrapper[4906]: I0310 00:08:17.281372 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/55d944ce-605e-41a7-9211-a5bc388145f1-metrics-certs\") pod \"network-metrics-daemon-5bn7b\" (UID: \"55d944ce-605e-41a7-9211-a5bc388145f1\") " pod="openshift-multus/network-metrics-daemon-5bn7b" Mar 10 00:08:17 crc kubenswrapper[4906]: E0310 00:08:17.281531 4906 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 00:08:17 crc kubenswrapper[4906]: E0310 00:08:17.281593 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55d944ce-605e-41a7-9211-a5bc388145f1-metrics-certs podName:55d944ce-605e-41a7-9211-a5bc388145f1 nodeName:}" failed. No retries permitted until 2026-03-10 00:08:21.281576904 +0000 UTC m=+127.429472016 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/55d944ce-605e-41a7-9211-a5bc388145f1-metrics-certs") pod "network-metrics-daemon-5bn7b" (UID: "55d944ce-605e-41a7-9211-a5bc388145f1") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 00:08:17 crc kubenswrapper[4906]: I0310 00:08:17.576033 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bn7b" Mar 10 00:08:17 crc kubenswrapper[4906]: E0310 00:08:17.576252 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bn7b" podUID="55d944ce-605e-41a7-9211-a5bc388145f1" Mar 10 00:08:18 crc kubenswrapper[4906]: I0310 00:08:18.576438 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:18 crc kubenswrapper[4906]: I0310 00:08:18.576513 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:18 crc kubenswrapper[4906]: I0310 00:08:18.576453 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:18 crc kubenswrapper[4906]: E0310 00:08:18.576672 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:18 crc kubenswrapper[4906]: E0310 00:08:18.576792 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:18 crc kubenswrapper[4906]: E0310 00:08:18.576953 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:19 crc kubenswrapper[4906]: I0310 00:08:19.576529 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bn7b" Mar 10 00:08:19 crc kubenswrapper[4906]: E0310 00:08:19.576793 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bn7b" podUID="55d944ce-605e-41a7-9211-a5bc388145f1" Mar 10 00:08:19 crc kubenswrapper[4906]: E0310 00:08:19.688879 4906 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 00:08:20 crc kubenswrapper[4906]: I0310 00:08:20.576366 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:20 crc kubenswrapper[4906]: I0310 00:08:20.576750 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:20 crc kubenswrapper[4906]: I0310 00:08:20.576868 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:20 crc kubenswrapper[4906]: E0310 00:08:20.577068 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:20 crc kubenswrapper[4906]: E0310 00:08:20.577286 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:20 crc kubenswrapper[4906]: E0310 00:08:20.578260 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:20 crc kubenswrapper[4906]: I0310 00:08:20.590150 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 10 00:08:21 crc kubenswrapper[4906]: I0310 00:08:21.330707 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/55d944ce-605e-41a7-9211-a5bc388145f1-metrics-certs\") pod \"network-metrics-daemon-5bn7b\" (UID: \"55d944ce-605e-41a7-9211-a5bc388145f1\") " pod="openshift-multus/network-metrics-daemon-5bn7b" Mar 10 00:08:21 crc kubenswrapper[4906]: E0310 00:08:21.330969 4906 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 00:08:21 crc kubenswrapper[4906]: E0310 00:08:21.331074 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55d944ce-605e-41a7-9211-a5bc388145f1-metrics-certs podName:55d944ce-605e-41a7-9211-a5bc388145f1 nodeName:}" failed. No retries permitted until 2026-03-10 00:08:29.331043779 +0000 UTC m=+135.478938921 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/55d944ce-605e-41a7-9211-a5bc388145f1-metrics-certs") pod "network-metrics-daemon-5bn7b" (UID: "55d944ce-605e-41a7-9211-a5bc388145f1") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 00:08:21 crc kubenswrapper[4906]: I0310 00:08:21.576009 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bn7b" Mar 10 00:08:21 crc kubenswrapper[4906]: E0310 00:08:21.576440 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bn7b" podUID="55d944ce-605e-41a7-9211-a5bc388145f1" Mar 10 00:08:22 crc kubenswrapper[4906]: I0310 00:08:22.575746 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:22 crc kubenswrapper[4906]: E0310 00:08:22.575934 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:22 crc kubenswrapper[4906]: I0310 00:08:22.575991 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:22 crc kubenswrapper[4906]: E0310 00:08:22.576202 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:22 crc kubenswrapper[4906]: I0310 00:08:22.576544 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:22 crc kubenswrapper[4906]: E0310 00:08:22.576684 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:23 crc kubenswrapper[4906]: I0310 00:08:23.575876 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bn7b" Mar 10 00:08:23 crc kubenswrapper[4906]: E0310 00:08:23.576064 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bn7b" podUID="55d944ce-605e-41a7-9211-a5bc388145f1" Mar 10 00:08:24 crc kubenswrapper[4906]: I0310 00:08:24.576255 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:24 crc kubenswrapper[4906]: I0310 00:08:24.576333 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:24 crc kubenswrapper[4906]: E0310 00:08:24.576498 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:24 crc kubenswrapper[4906]: E0310 00:08:24.576698 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:24 crc kubenswrapper[4906]: I0310 00:08:24.576870 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:24 crc kubenswrapper[4906]: E0310 00:08:24.577028 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:24 crc kubenswrapper[4906]: I0310 00:08:24.595020 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c37b204d34ae2c304ca12b8978e189ce99a6c9352c9f68e01d074a8fccc1d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:24Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:24 crc kubenswrapper[4906]: I0310 00:08:24.609260 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bd57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a677000-e7dd-40fa-ad3e-94c061293e29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dedc89ecfde3aab4fefadbd750541c4fe076777f8e0583ab87c3a33ad2d3b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"
lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hdfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bd57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:24Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:24 crc kubenswrapper[4906]: I0310 00:08:24.639157 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911516785da6f740d1b7e21cd68006ed12e6e826d87405543b150a5b6331deb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee1a9d86245fa50b0fdbc1fd897326de61c74df8a05667d2dfca5a397a105c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c09a8ef5244efb12dbf1782379b9633e2c35e936c2d1df7d715481c9b3bf0b21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0d58d338e79d263fe3e7d3ca54f65b5ff80376c0c094e3f51c66c00ad8ae878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58958cb3a6a43945e2e85351e464979217571ace2e8aff3602776ecb003df0be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f75ed8b74bcf143893e0244f5d561e39792f3c066b4ea296e5e75dd0999a43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48ab4689eb420e0f3f45dfefc400831eba530d7711acdf241ef8455057bbf3b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48ab4689eb420e0f3f45dfefc400831eba530d7711acdf241ef8455057bbf3b7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:12Z\\\",\\\"message\\\":\\\"ormers/externalversions/factory.go:141\\\\nI0310 00:08:12.737120 6898 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 00:08:12.737152 6898 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 00:08:12.737610 6898 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 00:08:12.741981 6898 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 00:08:12.742032 6898 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 00:08:12.742078 6898 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 00:08:12.742186 6898 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 00:08:12.742390 6898 factory.go:656] Stopping watch factory\\\\nI0310 00:08:12.773613 6898 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0310 00:08:12.773663 6898 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0310 00:08:12.773729 6898 ovnkube.go:599] Stopped ovnkube\\\\nI0310 00:08:12.773753 6898 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 00:08:12.773838 6898 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hskrb_openshift-ovn-kubernetes(c9f87520-6105-4b6f-ba5a-a232b5dc24c0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b0c40f63f43dcd6fb50a359080c2c076f62858b948841cdbaf8c354d198f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b72675ce4cf6e8d6
5d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hskrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:24Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:24 crc kubenswrapper[4906]: I0310 00:08:24.654429 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8681864-d447-44e6-9cf5-6408d49ec58a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4aeb4d2ad9fadbae974054da89166076b81a944e08f5475f1a9382a48cdc623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:24Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:24 crc kubenswrapper[4906]: I0310 00:08:24.678033 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1c5481b-1561-42af-9126-977132b7ced2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8b43acbd8d86bc6523a3304162f30fcf4068197c4a56018dc56e1c8ae0bcd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3effb7d619bec3fd1ef64a418efe1e3503cb2dd187ce2188911a90b4034d331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad3cc19888cbd84c556e52eda6a3be770768b7316982d6442717f6aee62191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd888b723dd7b84ded4f27c69d82e2b203fec96dbaab8198df43bf0806b34ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e5dc928d14830f187866775b2cd95c46cff2d7749025b36d16f6489662bafa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:24Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:24 crc kubenswrapper[4906]: E0310 00:08:24.689936 4906 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 00:08:24 crc kubenswrapper[4906]: I0310 00:08:24.702692 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:24Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:24 crc kubenswrapper[4906]: I0310 00:08:24.727399 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:24Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:24 crc kubenswrapper[4906]: I0310 00:08:24.751402 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:24Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:24 crc kubenswrapper[4906]: I0310 00:08:24.781803 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jgfpc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e902b2c-8bf9-49e8-9820-392f34dbfb10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245f709701748040cb586efcba214ea87e3834f74aec843887bf0a47996d71ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://461ade02a7f0df808708774e4e7ac0d3e416bbc56988253f3f7741b2403f411e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://461ade02a7f0df808708774e4e7ac0d3e416bbc56988253f3f7741b2403f411e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2137af85643edb4160cec0443c914a41c18cde0c48615d328ef82c9cb5976685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2137af85643edb4160cec0443c914a41c18cde0c48615d328ef82c9cb5976685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13423588b4cb100ba7f85054546fccc23f287f313cbd61c4b15ae75e9fd7135a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13423588b4cb100ba7f85054546fccc23f287f313cbd61c4b15ae75e9fd7135a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c384
93d85c4cafaa4905f7ce5ad87a912694bd749ea8f13483b3cdae026d357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c38493d85c4cafaa4905f7ce5ad87a912694bd749ea8f13483b3cdae026d357\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0d67319708302cc33fdc4484ac7ef77101029f60a4733f0f10f7ebb0bbbd86d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0d67319708302cc33fdc4484ac7ef77101029f60a4733f0f10f7ebb0bbbd86d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:04Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860be515a0723147ee78d8b9e1bc3d3fff9675739ced0812fc9207df8a481fec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://860be515a0723147ee78d8b9e1bc3d3fff9675739ced0812fc9207df8a481fec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jgfpc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:24Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:24 crc kubenswrapper[4906]: I0310 00:08:24.803108 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7bba372c-755f-4ff1-b1e4-2ccb6079d5a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0eb883e8f9f1bea4366cef5df73fd7a58ca339c0a3460cfccb92085380013dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://159a9ac970d599fe117be8d5340e4ab61b723d5765dc14fca8834807c38ead55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0eca935bc2a1af8e0157cb652fecdf524d319c037423de917084bc258eb00317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58522e059388d727d4f813a54
38c633d60317914d9cc8c3f01eec29c064e8b15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58522e059388d727d4f813a5438c633d60317914d9cc8c3f01eec29c064e8b15\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:24Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:24 crc kubenswrapper[4906]: I0310 00:08:24.824777 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fabe7c81-1aa3-4cec-a99f-4697dbe6921d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dab4356524a9cb048a66600ffecef81aa5f9f442b722d6780409970b0fe775d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b037feadaa2542ab3d4b67a5ced4e73ab0764d2c739c7ab3df075007346e16fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:44Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0310 00:06:16.678293 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 00:06:16.681109 1 observer_polling.go:159] Starting file observer\\\\nI0310 00:06:16.724153 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 00:06:16.731026 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 00:06:44.062582 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 00:06:44.062735 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aefd20293d0c60b7686525f505af7f8d1ddd1ab1b3c63f95787470f6ef538df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acd01291605de2fa2a88cfaff323d3a654acbaff0550f4faa872560a04ac582b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3090d401f5153e72de6163581cf256a48f9e2aed39a2a1ab1911f0322fc2fd4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:24Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:24 crc kubenswrapper[4906]: I0310 00:08:24.846943 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72569fcc50f473ea417eb7403f0901fc02d50a886ea39d62b2bb02682bc64dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ed04015b5fffd0db47db60c0aec762bd61d476c7cc3c0e726a5bd5de43baef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:24Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:24 crc 
kubenswrapper[4906]: I0310 00:08:24.866403 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72d61d35-0a64-45a5-8df3-9c429727deba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da01514bb52ced358e921f135d8e59cb221cf1c2c633fb9ee2b9e3d90ef0ffce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f53c021f85bbd1f77bb00169203a7ee8629e31f3fab47b40dc594d7995d4b82a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bxtw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:24Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:24 crc kubenswrapper[4906]: I0310 00:08:24.882845 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vwftl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c755e546-15de-4347-bc90-03a4ea362583\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ee4d840d8905aa3be1765cf7b77cb43f17139b794d7e9760b680ee47bb8d909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5d7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vwftl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:24Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:24 crc kubenswrapper[4906]: I0310 00:08:24.932612 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8518e6b0-48c0-4076-a58a-e2ba8751c90f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1ce33b4800ac33b19f03506f197894abd2b965c34db7b506d7f06e8824b0229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1237a4e233c776796482d73c65c85e5d86e592d6a7f53a7b76fea4529fe9e435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e445f27670002f5ac446f9be56896acdbf9e3e30801b073b231cc8ce7ddac7c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75634c35b3e26fec38c8aa89f5a8055117f60566c8d46cf9ec28cad40c688be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab422d45102787731cdbfd6d0e400974761d5af01f0a43d04cc25228c1db30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:06:54.228491 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:06:54.228657 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:06:54.229423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1709734890/tls.crt::/tmp/serving-cert-1709734890/tls.key\\\\\\\"\\\\nI0310 00:06:54.538287 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:06:54.541181 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:06:54.541202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:06:54.541225 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:06:54.541230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:06:54.549119 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 00:06:54.549178 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549189 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549199 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:06:54.549207 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:06:54.549213 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:06:54.549219 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 00:06:54.549128 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 00:06:54.551727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c49063cf40b4074c120bf78de05481a9e5638b11137a47d73d15b3e1a43f41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T0
0:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:24Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:24 crc kubenswrapper[4906]: I0310 00:08:24.956960 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-85dv2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c494c18-0d46-4e23-8ef5-214938a66a7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c5df9fb921fc76518f8da175e0a9b4e05f248b292a22ee6bcb600e07d94077a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftt77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-85dv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:24Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:24 crc kubenswrapper[4906]: I0310 00:08:24.978256 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5zmkq" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f18ec986-2e62-4a63-b0b1-8780eac64def\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc46d6cc757c0d334c48e6458543df7d62ae0209adab82b8e8f902b1c686a187\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwjjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebd1becf6
0e90db5a68bbcd8dc00cf672fd5b9f329aa84dd8a798060e8bb385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwjjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5zmkq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:24Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:24 crc kubenswrapper[4906]: I0310 00:08:24.999007 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5bn7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55d944ce-605e-41a7-9211-a5bc388145f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4pvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4pvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5bn7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:24Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:25 crc 
kubenswrapper[4906]: I0310 00:08:25.019478 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://161718aee1b97751dccdff3689a0dc26eb0efb1842899be5bb4994598adb36db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:25Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:25 crc kubenswrapper[4906]: I0310 00:08:25.575855 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bn7b" Mar 10 00:08:25 crc kubenswrapper[4906]: E0310 00:08:25.576068 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bn7b" podUID="55d944ce-605e-41a7-9211-a5bc388145f1" Mar 10 00:08:25 crc kubenswrapper[4906]: I0310 00:08:25.577232 4906 scope.go:117] "RemoveContainer" containerID="48ab4689eb420e0f3f45dfefc400831eba530d7711acdf241ef8455057bbf3b7" Mar 10 00:08:26 crc kubenswrapper[4906]: I0310 00:08:26.428969 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:26 crc kubenswrapper[4906]: I0310 00:08:26.429042 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:26 crc kubenswrapper[4906]: I0310 00:08:26.429067 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:26 crc kubenswrapper[4906]: I0310 00:08:26.429101 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:26 crc kubenswrapper[4906]: I0310 00:08:26.429160 4906 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:26Z","lastTransitionTime":"2026-03-10T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:26 crc kubenswrapper[4906]: E0310 00:08:26.449955 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a60179e-98a5-4d7f-9dd0-5aef84f37492\\\",\\\"systemUUID\\\":\\\"5a964b87-c4f2-4bba-95e1-b2c12e6316ae\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:26 crc kubenswrapper[4906]: I0310 00:08:26.456059 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:26 crc kubenswrapper[4906]: I0310 00:08:26.456117 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:26 crc kubenswrapper[4906]: I0310 00:08:26.456143 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:26 crc kubenswrapper[4906]: I0310 00:08:26.456178 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:26 crc kubenswrapper[4906]: I0310 00:08:26.456203 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:26Z","lastTransitionTime":"2026-03-10T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:26 crc kubenswrapper[4906]: E0310 00:08:26.470678 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a60179e-98a5-4d7f-9dd0-5aef84f37492\\\",\\\"systemUUID\\\":\\\"5a964b87-c4f2-4bba-95e1-b2c12e6316ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:26 crc kubenswrapper[4906]: I0310 00:08:26.475348 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:26 crc kubenswrapper[4906]: I0310 00:08:26.475391 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:26 crc kubenswrapper[4906]: I0310 00:08:26.475410 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:26 crc kubenswrapper[4906]: I0310 00:08:26.475435 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:26 crc kubenswrapper[4906]: I0310 00:08:26.475452 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:26Z","lastTransitionTime":"2026-03-10T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:26 crc kubenswrapper[4906]: I0310 00:08:26.497760 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:26 crc kubenswrapper[4906]: I0310 00:08:26.497840 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:26 crc kubenswrapper[4906]: I0310 00:08:26.497859 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:26 crc kubenswrapper[4906]: I0310 00:08:26.497892 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:26 crc kubenswrapper[4906]: I0310 00:08:26.497912 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:26Z","lastTransitionTime":"2026-03-10T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:26 crc kubenswrapper[4906]: I0310 00:08:26.524368 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:26 crc kubenswrapper[4906]: I0310 00:08:26.524422 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:26 crc kubenswrapper[4906]: I0310 00:08:26.524439 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:26 crc kubenswrapper[4906]: I0310 00:08:26.524472 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:26 crc kubenswrapper[4906]: I0310 00:08:26.524494 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:26Z","lastTransitionTime":"2026-03-10T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:26 crc kubenswrapper[4906]: E0310 00:08:26.546570 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a60179e-98a5-4d7f-9dd0-5aef84f37492\\\",\\\"systemUUID\\\":\\\"5a964b87-c4f2-4bba-95e1-b2c12e6316ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:26 crc kubenswrapper[4906]: E0310 00:08:26.546883 4906 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 00:08:26 crc kubenswrapper[4906]: I0310 00:08:26.576278 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:26 crc kubenswrapper[4906]: I0310 00:08:26.576424 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:26 crc kubenswrapper[4906]: E0310 00:08:26.576458 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:26 crc kubenswrapper[4906]: I0310 00:08:26.576445 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:26 crc kubenswrapper[4906]: E0310 00:08:26.576721 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:26 crc kubenswrapper[4906]: E0310 00:08:26.576907 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:26 crc kubenswrapper[4906]: I0310 00:08:26.629523 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hskrb_c9f87520-6105-4b6f-ba5a-a232b5dc24c0/ovnkube-controller/1.log" Mar 10 00:08:26 crc kubenswrapper[4906]: I0310 00:08:26.634126 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" event={"ID":"c9f87520-6105-4b6f-ba5a-a232b5dc24c0","Type":"ContainerStarted","Data":"6fe06556da9fc87aaebca8e8b6c030f6568bc352414267339059a3f9a1faff25"} Mar 10 00:08:26 crc kubenswrapper[4906]: I0310 00:08:26.636122 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:26 crc kubenswrapper[4906]: I0310 00:08:26.653925 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8681864-d447-44e6-9cf5-6408d49ec58a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4aeb4d2ad9fadbae974054da89166076b81a944e08f5475f1a9382a48cdc623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:26 crc kubenswrapper[4906]: I0310 00:08:26.690431 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c37b204d34ae2c304ca12b8978e189ce99a6c9352c9f68e01d074a8fccc1d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:26 crc kubenswrapper[4906]: I0310 00:08:26.710853 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bd57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a677000-e7dd-40fa-ad3e-94c061293e29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dedc89ecfde3aab4fefadbd750541c4fe076777f8e0583ab87c3a33ad2d3b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hdfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bd57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:26 crc kubenswrapper[4906]: I0310 00:08:26.751011 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers 
with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911516785da6f740d1b7e21cd68006ed12e6e826d87405543b150a5b6331deb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee1a9d86245fa50b0fdbc1fd897326de61c74df8a05667d2dfca5a397a105c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ov
n-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c09a8ef5244efb12dbf1782379b9633e2c35e936c2d1df7d715481c9b3bf0b21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0d58d338e79d263fe3e7d3ca54f65b5ff80376c0c094e3f51c66c00ad8ae878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58958cb3a6a43945e2e85351e464979217571ace2e8aff3602776ecb003df0be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f75ed8b74bcf143893e0244f5d561e39792f3c066b4ea296e5e75dd0999a43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fe06556da9fc87aaebca8e8b6c030f6568bc352414267339059a3f9a1faff25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48ab4689eb420e0f3f45dfefc400831eba530d7711acdf241ef8455057bbf3b7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:12Z\\\",\\\"message\\\":\\\"ormers/externalversions/factory.go:141\\\\nI0310 00:08:12.737120 6898 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 00:08:12.737152 6898 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 00:08:12.737610 6898 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 00:08:12.741981 6898 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 00:08:12.742032 6898 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 00:08:12.742078 6898 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 00:08:12.742186 6898 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 00:08:12.742390 6898 factory.go:656] Stopping watch factory\\\\nI0310 00:08:12.773613 6898 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0310 00:08:12.773663 6898 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0310 00:08:12.773729 6898 ovnkube.go:599] Stopped ovnkube\\\\nI0310 00:08:12.773753 6898 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 00:08:12.773838 6898 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b0c40f63f43dcd6fb50a359080c2c076f62858b948841cdbaf8c354d198f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hskrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:26 crc kubenswrapper[4906]: I0310 00:08:26.763689 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7bba372c-755f-4ff1-b1e4-2ccb6079d5a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0eb883e8f9f1bea4366cef5df73fd7a58ca339c0a3460cfccb92085380013dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://159a9ac970d599fe117be8d5340e4ab61b723d5765dc14fca8834807c38ead55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0eca935bc2a1af8e0157cb652fecdf524d319c037423de917084bc258eb00317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58522e059388d727d4f813a5438c633d60317914d9cc8c3f01eec29c064e8b15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://58522e059388d727d4f813a5438c633d60317914d9cc8c3f01eec29c064e8b15\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:26 crc kubenswrapper[4906]: I0310 00:08:26.783686 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1c5481b-1561-42af-9126-977132b7ced2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8b43acbd8d86bc6523a3304162f30fcf4068197c4a56018dc56e1c8ae0bcd4\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3effb7d619bec3fd1ef64a418efe1e3503cb2dd187ce2188911a90b4034d331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad3cc19888cbd84c556e52eda6a3be770768b7316982d6442717f6aee62191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd888b723dd7b84ded4f27c69d82e2b203fec96dbaab8198df43bf0806b34ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e5dc928d14830f187866775b2cd95c46cff2d7749025b36d16f6489662bafa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:26 crc kubenswrapper[4906]: I0310 00:08:26.802676 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:26 crc kubenswrapper[4906]: I0310 00:08:26.818392 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:26 crc kubenswrapper[4906]: I0310 00:08:26.838893 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:26 crc kubenswrapper[4906]: I0310 00:08:26.860142 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jgfpc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e902b2c-8bf9-49e8-9820-392f34dbfb10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245f709701748040cb586efcba214ea87e3834f74aec843887bf0a47996d71ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://461ade02a7f0df808708774e4e7ac0d3e416bbc56988253f3f7741b2403f411e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://461ade02a7f0df808708774e4e7ac0d3e416bbc56988253f3f7741b2403f411e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2137af85643edb4160cec0443c914a41c18cde0c48615d328ef82c9cb5976685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2137af85643edb4160cec0443c914a41c18cde0c48615d328ef82c9cb5976685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13423588b4cb100ba7f85054546fccc23f287f313cbd61c4b15ae75e9fd7135a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13423588b4cb100ba7f85054546fccc23f287f313cbd61c4b15ae75e9fd7135a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c384
93d85c4cafaa4905f7ce5ad87a912694bd749ea8f13483b3cdae026d357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c38493d85c4cafaa4905f7ce5ad87a912694bd749ea8f13483b3cdae026d357\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0d67319708302cc33fdc4484ac7ef77101029f60a4733f0f10f7ebb0bbbd86d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0d67319708302cc33fdc4484ac7ef77101029f60a4733f0f10f7ebb0bbbd86d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:04Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860be515a0723147ee78d8b9e1bc3d3fff9675739ced0812fc9207df8a481fec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://860be515a0723147ee78d8b9e1bc3d3fff9675739ced0812fc9207df8a481fec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jgfpc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:26 crc kubenswrapper[4906]: I0310 00:08:26.883776 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8518e6b0-48c0-4076-a58a-e2ba8751c90f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1ce33b4800ac33b19f03506f197894abd2b965c34db7b506d7f06e8824b0229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1237a4e233c776796482d73c65c85e5d86e592d6a7f53a7b76fea4529fe9e435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e445f27670002f5ac446f9be56896acdbf9e3e30801b073b231cc8ce7ddac7c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75634c35b3e26fec38c8aa89f5a8055117f60566c8d46cf9ec28cad40c688be\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab422d45102787731cdbfd6d0e400974761d5af01f0a43d04cc25228c1db30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:06:54.228491 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:06:54.228657 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:06:54.229423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1709734890/tls.crt::/tmp/serving-cert-1709734890/tls.key\\\\\\\"\\\\nI0310 00:06:54.538287 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:06:54.541181 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:06:54.541202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:06:54.541225 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:06:54.541230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:06:54.549119 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 00:06:54.549178 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549189 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549199 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:06:54.549207 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:06:54.549213 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:06:54.549219 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 00:06:54.549128 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 00:06:54.551727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c49063cf40b4074c120bf78de05481a9e5638b11137a47d73d15b3e1a43f41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:26 crc kubenswrapper[4906]: I0310 00:08:26.904806 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fabe7c81-1aa3-4cec-a99f-4697dbe6921d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dab4356524a9cb048a66600ffecef81aa5f9f442b722d6780409970b0fe775d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b037feadaa2542ab3d4b67a5ced4e73ab0764d2c739c7ab3df075007346e16fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:44Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0310 00:06:16.678293 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 00:06:16.681109 1 observer_polling.go:159] Starting file observer\\\\nI0310 00:06:16.724153 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 00:06:16.731026 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 00:06:44.062582 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 00:06:44.062735 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aefd20293d0c60b7686525f505af7f8d1ddd1ab1b3c63f95787470f6ef538df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acd01291605de2fa2a88cfaff323d3a654acbaff0550f4faa872560a04ac582b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3090d401f5153e72de6163581cf256a48f9e2aed39a2a1ab1911f0322fc2fd4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:26 crc kubenswrapper[4906]: I0310 00:08:26.931274 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72569fcc50f473ea417eb7403f0901fc02d50a886ea39d62b2bb02682bc64dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ed04015b5fffd0db47db60c0aec762bd61d476c7cc3c0e726a5bd5de43baef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:26 crc 
kubenswrapper[4906]: I0310 00:08:26.950719 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72d61d35-0a64-45a5-8df3-9c429727deba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da01514bb52ced358e921f135d8e59cb221cf1c2c633fb9ee2b9e3d90ef0ffce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f53c021f85bbd1f77bb00169203a7ee8629e31f3fab47b40dc594d7995d4b82a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bxtw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:26 crc kubenswrapper[4906]: I0310 00:08:26.964688 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vwftl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c755e546-15de-4347-bc90-03a4ea362583\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ee4d840d8905aa3be1765cf7b77cb43f17139b794d7e9760b680ee47bb8d909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5d7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vwftl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:26 crc kubenswrapper[4906]: I0310 00:08:26.978386 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://161718aee1b97751dccdff3689a0dc26eb0efb1842899be5bb4994598adb36db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T0
0:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:26 crc kubenswrapper[4906]: I0310 00:08:26.999314 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-85dv2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c494c18-0d46-4e23-8ef5-214938a66a7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c5df9fb921fc76518f8da175e0a9b4e05f248b292a22ee6bcb600e07d94077a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftt77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-85dv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:27 crc kubenswrapper[4906]: I0310 00:08:27.016823 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5zmkq" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f18ec986-2e62-4a63-b0b1-8780eac64def\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc46d6cc757c0d334c48e6458543df7d62ae0209adab82b8e8f902b1c686a187\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwjjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebd1becf6
0e90db5a68bbcd8dc00cf672fd5b9f329aa84dd8a798060e8bb385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwjjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5zmkq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:27 crc kubenswrapper[4906]: I0310 00:08:27.032598 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5bn7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55d944ce-605e-41a7-9211-a5bc388145f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4pvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4pvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5bn7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:27 crc 
kubenswrapper[4906]: I0310 00:08:27.576365 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bn7b" Mar 10 00:08:27 crc kubenswrapper[4906]: E0310 00:08:27.576579 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bn7b" podUID="55d944ce-605e-41a7-9211-a5bc388145f1" Mar 10 00:08:27 crc kubenswrapper[4906]: I0310 00:08:27.641058 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hskrb_c9f87520-6105-4b6f-ba5a-a232b5dc24c0/ovnkube-controller/2.log" Mar 10 00:08:27 crc kubenswrapper[4906]: I0310 00:08:27.642257 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hskrb_c9f87520-6105-4b6f-ba5a-a232b5dc24c0/ovnkube-controller/1.log" Mar 10 00:08:27 crc kubenswrapper[4906]: I0310 00:08:27.646415 4906 generic.go:334] "Generic (PLEG): container finished" podID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" containerID="6fe06556da9fc87aaebca8e8b6c030f6568bc352414267339059a3f9a1faff25" exitCode=1 Mar 10 00:08:27 crc kubenswrapper[4906]: I0310 00:08:27.646507 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" event={"ID":"c9f87520-6105-4b6f-ba5a-a232b5dc24c0","Type":"ContainerDied","Data":"6fe06556da9fc87aaebca8e8b6c030f6568bc352414267339059a3f9a1faff25"} Mar 10 00:08:27 crc kubenswrapper[4906]: I0310 00:08:27.646619 4906 scope.go:117] "RemoveContainer" containerID="48ab4689eb420e0f3f45dfefc400831eba530d7711acdf241ef8455057bbf3b7" Mar 10 00:08:27 crc kubenswrapper[4906]: I0310 00:08:27.647705 4906 scope.go:117] "RemoveContainer" 
containerID="6fe06556da9fc87aaebca8e8b6c030f6568bc352414267339059a3f9a1faff25" Mar 10 00:08:27 crc kubenswrapper[4906]: E0310 00:08:27.648021 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-hskrb_openshift-ovn-kubernetes(c9f87520-6105-4b6f-ba5a-a232b5dc24c0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" podUID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" Mar 10 00:08:27 crc kubenswrapper[4906]: I0310 00:08:27.670796 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://161718aee1b97751dccdff3689a0dc26eb0efb1842899be5bb4994598adb36db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:4
1Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:27 crc kubenswrapper[4906]: I0310 00:08:27.694484 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-85dv2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c494c18-0d46-4e23-8ef5-214938a66a7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c5df9fb921fc76518f8da175e0a9b4e05f248b292a22ee6bcb600e07d94077a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftt77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-85dv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:27 crc kubenswrapper[4906]: I0310 00:08:27.714108 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5zmkq" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f18ec986-2e62-4a63-b0b1-8780eac64def\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc46d6cc757c0d334c48e6458543df7d62ae0209adab82b8e8f902b1c686a187\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwjjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebd1becf6
0e90db5a68bbcd8dc00cf672fd5b9f329aa84dd8a798060e8bb385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwjjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5zmkq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:27 crc kubenswrapper[4906]: I0310 00:08:27.732968 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5bn7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55d944ce-605e-41a7-9211-a5bc388145f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4pvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4pvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5bn7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:27 crc 
kubenswrapper[4906]: I0310 00:08:27.747204 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8681864-d447-44e6-9cf5-6408d49ec58a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4aeb4d2ad9fadbae974054da89166076b81a944e08f5475f1a9382a48cdc623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:27 crc kubenswrapper[4906]: I0310 00:08:27.767432 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c37b204d34ae2c304ca12b8978e189ce99a6c9352c9f68e01d074a8fccc1d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:27 crc kubenswrapper[4906]: I0310 00:08:27.789631 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bd57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a677000-e7dd-40fa-ad3e-94c061293e29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dedc89ecfde3aab4fefadbd750541c4fe076777f8e0583ab87c3a33ad2d3b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hdfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bd57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:27 crc kubenswrapper[4906]: I0310 00:08:27.823161 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers 
with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911516785da6f740d1b7e21cd68006ed12e6e826d87405543b150a5b6331deb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee1a9d86245fa50b0fdbc1fd897326de61c74df8a05667d2dfca5a397a105c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ov
n-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c09a8ef5244efb12dbf1782379b9633e2c35e936c2d1df7d715481c9b3bf0b21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0d58d338e79d263fe3e7d3ca54f65b5ff80376c0c094e3f51c66c00ad8ae878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58958cb3a6a43945e2e85351e464979217571ace2e8aff3602776ecb003df0be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f75ed8b74bcf143893e0244f5d561e39792f3c066b4ea296e5e75dd0999a43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fe06556da9fc87aaebca8e8b6c030f6568bc352414267339059a3f9a1faff25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://48ab4689eb420e0f3f45dfefc400831eba530d7711acdf241ef8455057bbf3b7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:12Z\\\",\\\"message\\\":\\\"ormers/externalversions/factory.go:141\\\\nI0310 00:08:12.737120 6898 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 00:08:12.737152 6898 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0310 00:08:12.737610 6898 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 00:08:12.741981 6898 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 00:08:12.742032 6898 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 00:08:12.742078 6898 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 00:08:12.742186 6898 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 00:08:12.742390 6898 factory.go:656] Stopping watch factory\\\\nI0310 00:08:12.773613 6898 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0310 00:08:12.773663 6898 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0310 00:08:12.773729 6898 ovnkube.go:599] Stopped ovnkube\\\\nI0310 00:08:12.773753 6898 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0310 00:08:12.773838 6898 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fe06556da9fc87aaebca8e8b6c030f6568bc352414267339059a3f9a1faff25\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:26Z\\\",\\\"message\\\":\\\"topping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 00:08:26.647077 7154 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 00:08:26.647129 7154 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0310 00:08:26.647139 7154 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0310 00:08:26.647206 7154 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for 
removal\\\\nI0310 00:08:26.647203 7154 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 00:08:26.647239 7154 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 00:08:26.647247 7154 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 00:08:26.647268 7154 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 00:08:26.647250 7154 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 00:08:26.647272 7154 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 00:08:26.647315 7154 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 00:08:26.647374 7154 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 00:08:26.647426 7154 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 00:08:26.647475 7154 factory.go:656] Stopping watch factory\\\\nI0310 00:08:26.647502 7154 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\
"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b0c40f63f43dcd6fb50a359080c2c076f62858b948841cdbaf8c354d198f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\
":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hskrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:27 crc kubenswrapper[4906]: I0310 00:08:27.845603 4906 
status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7bba372c-755f-4ff1-b1e4-2ccb6079d5a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0eb883e8f9f1bea4366cef5df73fd7a58ca339c0a3460cfccb92085380013dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://159a9ac970d599fe117be8d5340e4ab61b723d5765dc14fca8834807c38ead55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0eca935bc2a1af8e0157cb652fecdf524d319c037423de917084bc258eb00317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58522e059388d727d4f813a5438c633d60317914d9cc8c3f01eec29c064e8b15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58522e059388d727d4f813a5438c633d60317914d9cc8c3f01eec29c064e8b15\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:27 crc kubenswrapper[4906]: I0310 00:08:27.882359 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1c5481b-1561-42af-9126-977132b7ced2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8b43acbd8d86bc6523a3304162f30fcf4068197c4a56018dc56e1c8ae0bcd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3effb7d619bec3fd1ef64a418efe1e3503cb2dd187ce2188911a90b4034d331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad3cc19888cbd84c556e52eda6a3be770768b7316982d6442717f6aee62191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd888b723dd7b84ded4f27c69d82e2b203fec96dbaab8198df43bf0806b34ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e5dc928d14830f187866775b2cd95c46cff2d7749025b36d16f6489662bafa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:27 crc kubenswrapper[4906]: I0310 00:08:27.902727 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:27 crc kubenswrapper[4906]: I0310 00:08:27.925442 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:27 crc kubenswrapper[4906]: I0310 00:08:27.945430 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:27 crc kubenswrapper[4906]: I0310 00:08:27.964922 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jgfpc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e902b2c-8bf9-49e8-9820-392f34dbfb10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245f709701748040cb586efcba214ea87e3834f74aec843887bf0a47996d71ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://461ade02a7f0df808708774e4e7ac0d3e416bbc56988253f3f7741b2403f411e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://461ade02a7f0df808708774e4e7ac0d3e416bbc56988253f3f7741b2403f411e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2137af85643edb4160cec0443c914a41c18cde0c48615d328ef82c9cb5976685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2137af85643edb4160cec0443c914a41c18cde0c48615d328ef82c9cb5976685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13423588b4cb100ba7f85054546fccc23f287f313cbd61c4b15ae75e9fd7135a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13423588b4cb100ba7f85054546fccc23f287f313cbd61c4b15ae75e9fd7135a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c384
93d85c4cafaa4905f7ce5ad87a912694bd749ea8f13483b3cdae026d357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c38493d85c4cafaa4905f7ce5ad87a912694bd749ea8f13483b3cdae026d357\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0d67319708302cc33fdc4484ac7ef77101029f60a4733f0f10f7ebb0bbbd86d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0d67319708302cc33fdc4484ac7ef77101029f60a4733f0f10f7ebb0bbbd86d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:04Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860be515a0723147ee78d8b9e1bc3d3fff9675739ced0812fc9207df8a481fec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://860be515a0723147ee78d8b9e1bc3d3fff9675739ced0812fc9207df8a481fec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jgfpc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:27 crc kubenswrapper[4906]: I0310 00:08:27.989989 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8518e6b0-48c0-4076-a58a-e2ba8751c90f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1ce33b4800ac33b19f03506f197894abd2b965c34db7b506d7f06e8824b0229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1237a4e233c776796482d73c65c85e5d86e592d6a7f53a7b76fea4529fe9e435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e445f27670002f5ac446f9be56896acdbf9e3e30801b073b231cc8ce7ddac7c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75634c35b3e26fec38c8aa89f5a8055117f60566c8d46cf9ec28cad40c688be\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab422d45102787731cdbfd6d0e400974761d5af01f0a43d04cc25228c1db30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:06:54.228491 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:06:54.228657 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:06:54.229423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1709734890/tls.crt::/tmp/serving-cert-1709734890/tls.key\\\\\\\"\\\\nI0310 00:06:54.538287 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:06:54.541181 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:06:54.541202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:06:54.541225 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:06:54.541230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:06:54.549119 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 00:06:54.549178 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549189 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549199 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:06:54.549207 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:06:54.549213 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:06:54.549219 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 00:06:54.549128 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 00:06:54.551727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c49063cf40b4074c120bf78de05481a9e5638b11137a47d73d15b3e1a43f41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:27Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:28 crc kubenswrapper[4906]: I0310 00:08:28.013311 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fabe7c81-1aa3-4cec-a99f-4697dbe6921d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dab4356524a9cb048a66600ffecef81aa5f9f442b722d6780409970b0fe775d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b037feadaa2542ab3d4b67a5ced4e73ab0764d2c739c7ab3df075007346e16fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:44Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0310 00:06:16.678293 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 00:06:16.681109 1 observer_polling.go:159] Starting file observer\\\\nI0310 00:06:16.724153 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 00:06:16.731026 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 00:06:44.062582 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 00:06:44.062735 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aefd20293d0c60b7686525f505af7f8d1ddd1ab1b3c63f95787470f6ef538df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acd01291605de2fa2a88cfaff323d3a654acbaff0550f4faa872560a04ac582b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3090d401f5153e72de6163581cf256a48f9e2aed39a2a1ab1911f0322fc2fd4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:28Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:28 crc kubenswrapper[4906]: I0310 00:08:28.032837 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72569fcc50f473ea417eb7403f0901fc02d50a886ea39d62b2bb02682bc64dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ed04015b5fffd0db47db60c0aec762bd61d476c7cc3c0e726a5bd5de43baef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:28Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:28 crc 
kubenswrapper[4906]: I0310 00:08:28.051588 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72d61d35-0a64-45a5-8df3-9c429727deba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da01514bb52ced358e921f135d8e59cb221cf1c2c633fb9ee2b9e3d90ef0ffce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f53c021f85bbd1f77bb00169203a7ee8629e31f3fab47b40dc594d7995d4b82a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bxtw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:28Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:28 crc kubenswrapper[4906]: I0310 00:08:28.067975 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vwftl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c755e546-15de-4347-bc90-03a4ea362583\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ee4d840d8905aa3be1765cf7b77cb43f17139b794d7e9760b680ee47bb8d909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5d7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vwftl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:28Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:28 crc kubenswrapper[4906]: I0310 00:08:28.428427 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:08:28 crc kubenswrapper[4906]: E0310 00:08:28.428720 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:32.42868603 +0000 UTC m=+198.576581142 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:08:28 crc kubenswrapper[4906]: I0310 00:08:28.530290 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:28 crc kubenswrapper[4906]: I0310 00:08:28.530370 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:28 crc kubenswrapper[4906]: I0310 00:08:28.530448 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:28 crc kubenswrapper[4906]: I0310 00:08:28.530500 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:28 crc kubenswrapper[4906]: E0310 00:08:28.530605 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 00:08:28 crc kubenswrapper[4906]: E0310 00:08:28.530674 4906 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 00:08:28 crc kubenswrapper[4906]: E0310 00:08:28.530784 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 00:08:28 crc kubenswrapper[4906]: E0310 00:08:28.530686 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 00:08:28 crc kubenswrapper[4906]: E0310 00:08:28.530829 4906 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 00:08:28 crc kubenswrapper[4906]: E0310 00:08:28.530848 4906 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:08:28 crc kubenswrapper[4906]: E0310 00:08:28.530791 4906 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 00:09:32.530766826 +0000 UTC m=+198.678661938 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 00:08:28 crc kubenswrapper[4906]: E0310 00:08:28.530849 4906 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:08:28 crc kubenswrapper[4906]: E0310 00:08:28.530987 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 00:09:32.530957611 +0000 UTC m=+198.678852733 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:08:28 crc kubenswrapper[4906]: E0310 00:08:28.530684 4906 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 00:08:28 crc kubenswrapper[4906]: E0310 00:08:28.531041 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 00:09:32.531010223 +0000 UTC m=+198.678905375 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 00:08:28 crc kubenswrapper[4906]: E0310 00:08:28.531151 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 00:09:32.531123296 +0000 UTC m=+198.679018618 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 00:08:28 crc kubenswrapper[4906]: I0310 00:08:28.576074 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:28 crc kubenswrapper[4906]: I0310 00:08:28.576075 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:28 crc kubenswrapper[4906]: E0310 00:08:28.576303 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:28 crc kubenswrapper[4906]: I0310 00:08:28.576239 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:28 crc kubenswrapper[4906]: E0310 00:08:28.576497 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:28 crc kubenswrapper[4906]: E0310 00:08:28.576746 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:28 crc kubenswrapper[4906]: I0310 00:08:28.654011 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hskrb_c9f87520-6105-4b6f-ba5a-a232b5dc24c0/ovnkube-controller/2.log" Mar 10 00:08:28 crc kubenswrapper[4906]: I0310 00:08:28.661234 4906 scope.go:117] "RemoveContainer" containerID="6fe06556da9fc87aaebca8e8b6c030f6568bc352414267339059a3f9a1faff25" Mar 10 00:08:28 crc kubenswrapper[4906]: E0310 00:08:28.661546 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-hskrb_openshift-ovn-kubernetes(c9f87520-6105-4b6f-ba5a-a232b5dc24c0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" podUID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" Mar 10 00:08:28 crc kubenswrapper[4906]: I0310 00:08:28.685276 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7bba372c-755f-4ff1-b1e4-2ccb6079d5a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0eb883e8f9f1bea4366cef5df73fd7a58ca339c0a3460cfccb92085380013dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://159a9ac970d599fe117be8d5340e4ab61b723d5765dc14fca8834807c38ead55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0eca935bc2a1af8e0157cb652fecdf524d319c037423de917084bc258eb00317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58522e059388d727d4f813a5438c633d60317914d9cc8c3f01eec29c064e8b15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://58522e059388d727d4f813a5438c633d60317914d9cc8c3f01eec29c064e8b15\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:28Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:28 crc kubenswrapper[4906]: I0310 00:08:28.721284 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1c5481b-1561-42af-9126-977132b7ced2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8b43acbd8d86bc6523a3304162f30fcf4068197c4a56018dc56e1c8ae0bcd4\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3effb7d619bec3fd1ef64a418efe1e3503cb2dd187ce2188911a90b4034d331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad3cc19888cbd84c556e52eda6a3be770768b7316982d6442717f6aee62191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd888b723dd7b84ded4f27c69d82e2b203fec96dbaab8198df43bf0806b34ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e5dc928d14830f187866775b2cd95c46cff2d7749025b36d16f6489662bafa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:28Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:28 crc kubenswrapper[4906]: I0310 00:08:28.745302 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:28Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:28 crc kubenswrapper[4906]: I0310 00:08:28.769268 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:28Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:28 crc kubenswrapper[4906]: I0310 00:08:28.791249 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:28Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:28 crc kubenswrapper[4906]: I0310 00:08:28.814185 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jgfpc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e902b2c-8bf9-49e8-9820-392f34dbfb10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245f709701748040cb586efcba214ea87e3834f74aec843887bf0a47996d71ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://461ade02a7f0df808708774e4e7ac0d3e416bbc56988253f3f7741b2403f411e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://461ade02a7f0df808708774e4e7ac0d3e416bbc56988253f3f7741b2403f411e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2137af85643edb4160cec0443c914a41c18cde0c48615d328ef82c9cb5976685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2137af85643edb4160cec0443c914a41c18cde0c48615d328ef82c9cb5976685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13423588b4cb100ba7f85054546fccc23f287f313cbd61c4b15ae75e9fd7135a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13423588b4cb100ba7f85054546fccc23f287f313cbd61c4b15ae75e9fd7135a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c384
93d85c4cafaa4905f7ce5ad87a912694bd749ea8f13483b3cdae026d357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c38493d85c4cafaa4905f7ce5ad87a912694bd749ea8f13483b3cdae026d357\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0d67319708302cc33fdc4484ac7ef77101029f60a4733f0f10f7ebb0bbbd86d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0d67319708302cc33fdc4484ac7ef77101029f60a4733f0f10f7ebb0bbbd86d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:04Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860be515a0723147ee78d8b9e1bc3d3fff9675739ced0812fc9207df8a481fec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://860be515a0723147ee78d8b9e1bc3d3fff9675739ced0812fc9207df8a481fec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jgfpc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:28Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:28 crc kubenswrapper[4906]: I0310 00:08:28.838387 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8518e6b0-48c0-4076-a58a-e2ba8751c90f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1ce33b4800ac33b19f03506f197894abd2b965c34db7b506d7f06e8824b0229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1237a4e233c776796482d73c65c85e5d86e592d6a7f53a7b76fea4529fe9e435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e445f27670002f5ac446f9be56896acdbf9e3e30801b073b231cc8ce7ddac7c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75634c35b3e26fec38c8aa89f5a8055117f60566c8d46cf9ec28cad40c688be\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab422d45102787731cdbfd6d0e400974761d5af01f0a43d04cc25228c1db30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:06:54.228491 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:06:54.228657 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:06:54.229423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1709734890/tls.crt::/tmp/serving-cert-1709734890/tls.key\\\\\\\"\\\\nI0310 00:06:54.538287 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:06:54.541181 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:06:54.541202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:06:54.541225 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:06:54.541230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:06:54.549119 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 00:06:54.549178 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549189 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549199 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:06:54.549207 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:06:54.549213 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:06:54.549219 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 00:06:54.549128 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 00:06:54.551727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c49063cf40b4074c120bf78de05481a9e5638b11137a47d73d15b3e1a43f41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:28Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:28 crc kubenswrapper[4906]: I0310 00:08:28.863147 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fabe7c81-1aa3-4cec-a99f-4697dbe6921d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dab4356524a9cb048a66600ffecef81aa5f9f442b722d6780409970b0fe775d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b037feadaa2542ab3d4b67a5ced4e73ab0764d2c739c7ab3df075007346e16fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:44Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0310 00:06:16.678293 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 00:06:16.681109 1 observer_polling.go:159] Starting file observer\\\\nI0310 00:06:16.724153 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 00:06:16.731026 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 00:06:44.062582 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 00:06:44.062735 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aefd20293d0c60b7686525f505af7f8d1ddd1ab1b3c63f95787470f6ef538df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acd01291605de2fa2a88cfaff323d3a654acbaff0550f4faa872560a04ac582b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3090d401f5153e72de6163581cf256a48f9e2aed39a2a1ab1911f0322fc2fd4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:28Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:28 crc kubenswrapper[4906]: I0310 00:08:28.886310 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72569fcc50f473ea417eb7403f0901fc02d50a886ea39d62b2bb02682bc64dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ed04015b5fffd0db47db60c0aec762bd61d476c7cc3c0e726a5bd5de43baef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:28Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:28 crc 
kubenswrapper[4906]: I0310 00:08:28.908455 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72d61d35-0a64-45a5-8df3-9c429727deba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da01514bb52ced358e921f135d8e59cb221cf1c2c633fb9ee2b9e3d90ef0ffce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f53c021f85bbd1f77bb00169203a7ee8629e31f3fab47b40dc594d7995d4b82a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bxtw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:28Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:28 crc kubenswrapper[4906]: I0310 00:08:28.926730 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vwftl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c755e546-15de-4347-bc90-03a4ea362583\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ee4d840d8905aa3be1765cf7b77cb43f17139b794d7e9760b680ee47bb8d909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5d7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vwftl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:28Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:28 crc kubenswrapper[4906]: I0310 00:08:28.949356 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://161718aee1b97751dccdff3689a0dc26eb0efb1842899be5bb4994598adb36db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T0
0:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:28Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:28 crc kubenswrapper[4906]: I0310 00:08:28.971771 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-85dv2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c494c18-0d46-4e23-8ef5-214938a66a7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c5df9fb921fc76518f8da175e0a9b4e05f248b292a22ee6bcb600e07d94077a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftt77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-85dv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:28Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:28 crc kubenswrapper[4906]: I0310 00:08:28.990988 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5zmkq" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f18ec986-2e62-4a63-b0b1-8780eac64def\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc46d6cc757c0d334c48e6458543df7d62ae0209adab82b8e8f902b1c686a187\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwjjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebd1becf6
0e90db5a68bbcd8dc00cf672fd5b9f329aa84dd8a798060e8bb385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwjjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5zmkq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:28Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:29 crc kubenswrapper[4906]: I0310 00:08:29.007527 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5bn7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55d944ce-605e-41a7-9211-a5bc388145f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4pvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4pvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5bn7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:29Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:29 crc 
kubenswrapper[4906]: I0310 00:08:29.023765 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8681864-d447-44e6-9cf5-6408d49ec58a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4aeb4d2ad9fadbae974054da89166076b81a944e08f5475f1a9382a48cdc623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:29Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:29 crc kubenswrapper[4906]: I0310 00:08:29.046702 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c37b204d34ae2c304ca12b8978e189ce99a6c9352c9f68e01d074a8fccc1d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:29Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:29 crc kubenswrapper[4906]: I0310 00:08:29.064289 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bd57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a677000-e7dd-40fa-ad3e-94c061293e29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dedc89ecfde3aab4fefadbd750541c4fe076777f8e0583ab87c3a33ad2d3b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hdfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bd57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:29Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:29 crc kubenswrapper[4906]: I0310 00:08:29.098814 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers 
with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911516785da6f740d1b7e21cd68006ed12e6e826d87405543b150a5b6331deb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee1a9d86245fa50b0fdbc1fd897326de61c74df8a05667d2dfca5a397a105c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ov
n-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c09a8ef5244efb12dbf1782379b9633e2c35e936c2d1df7d715481c9b3bf0b21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0d58d338e79d263fe3e7d3ca54f65b5ff80376c0c094e3f51c66c00ad8ae878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58958cb3a6a43945e2e85351e464979217571ace2e8aff3602776ecb003df0be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f75ed8b74bcf143893e0244f5d561e39792f3c066b4ea296e5e75dd0999a43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fe06556da9fc87aaebca8e8b6c030f6568bc352414267339059a3f9a1faff25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fe06556da9fc87aaebca8e8b6c030f6568bc352414267339059a3f9a1faff25\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:26Z\\\",\\\"message\\\":\\\"topping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 00:08:26.647077 7154 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 00:08:26.647129 7154 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0310 00:08:26.647139 7154 handler.go:190] Sending *v1.Pod 
event handler 3 for removal\\\\nI0310 00:08:26.647206 7154 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 00:08:26.647203 7154 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 00:08:26.647239 7154 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 00:08:26.647247 7154 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 00:08:26.647268 7154 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 00:08:26.647250 7154 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 00:08:26.647272 7154 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 00:08:26.647315 7154 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 00:08:26.647374 7154 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 00:08:26.647426 7154 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 00:08:26.647475 7154 factory.go:656] Stopping watch factory\\\\nI0310 00:08:26.647502 7154 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hskrb_openshift-ovn-kubernetes(c9f87520-6105-4b6f-ba5a-a232b5dc24c0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b0c40f63f43dcd6fb50a359080c2c076f62858b948841cdbaf8c354d198f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b72675ce4cf6e8d6
5d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hskrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:29Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:29 crc kubenswrapper[4906]: I0310 00:08:29.341117 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/55d944ce-605e-41a7-9211-a5bc388145f1-metrics-certs\") pod \"network-metrics-daemon-5bn7b\" (UID: \"55d944ce-605e-41a7-9211-a5bc388145f1\") " pod="openshift-multus/network-metrics-daemon-5bn7b" Mar 10 00:08:29 crc kubenswrapper[4906]: E0310 00:08:29.341371 4906 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 00:08:29 crc kubenswrapper[4906]: E0310 00:08:29.341525 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55d944ce-605e-41a7-9211-a5bc388145f1-metrics-certs podName:55d944ce-605e-41a7-9211-a5bc388145f1 nodeName:}" failed. 
No retries permitted until 2026-03-10 00:08:45.341487678 +0000 UTC m=+151.489382830 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/55d944ce-605e-41a7-9211-a5bc388145f1-metrics-certs") pod "network-metrics-daemon-5bn7b" (UID: "55d944ce-605e-41a7-9211-a5bc388145f1") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 00:08:29 crc kubenswrapper[4906]: I0310 00:08:29.576010 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bn7b" Mar 10 00:08:29 crc kubenswrapper[4906]: E0310 00:08:29.576227 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bn7b" podUID="55d944ce-605e-41a7-9211-a5bc388145f1" Mar 10 00:08:29 crc kubenswrapper[4906]: E0310 00:08:29.691957 4906 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 00:08:30 crc kubenswrapper[4906]: I0310 00:08:30.576119 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:30 crc kubenswrapper[4906]: I0310 00:08:30.576181 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:30 crc kubenswrapper[4906]: E0310 00:08:30.576361 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:30 crc kubenswrapper[4906]: I0310 00:08:30.576461 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:30 crc kubenswrapper[4906]: E0310 00:08:30.576563 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:30 crc kubenswrapper[4906]: E0310 00:08:30.576902 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:31 crc kubenswrapper[4906]: I0310 00:08:31.576121 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bn7b" Mar 10 00:08:31 crc kubenswrapper[4906]: E0310 00:08:31.576295 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bn7b" podUID="55d944ce-605e-41a7-9211-a5bc388145f1" Mar 10 00:08:32 crc kubenswrapper[4906]: I0310 00:08:32.575984 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:32 crc kubenswrapper[4906]: I0310 00:08:32.576031 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:32 crc kubenswrapper[4906]: I0310 00:08:32.576037 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:32 crc kubenswrapper[4906]: E0310 00:08:32.576212 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:32 crc kubenswrapper[4906]: E0310 00:08:32.576338 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:32 crc kubenswrapper[4906]: E0310 00:08:32.576503 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:33 crc kubenswrapper[4906]: I0310 00:08:33.576342 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bn7b" Mar 10 00:08:33 crc kubenswrapper[4906]: E0310 00:08:33.576567 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bn7b" podUID="55d944ce-605e-41a7-9211-a5bc388145f1" Mar 10 00:08:34 crc kubenswrapper[4906]: I0310 00:08:34.575899 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:34 crc kubenswrapper[4906]: I0310 00:08:34.576097 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:34 crc kubenswrapper[4906]: E0310 00:08:34.576339 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:34 crc kubenswrapper[4906]: I0310 00:08:34.576516 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:34 crc kubenswrapper[4906]: E0310 00:08:34.576708 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:34 crc kubenswrapper[4906]: E0310 00:08:34.576933 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:34 crc kubenswrapper[4906]: I0310 00:08:34.595823 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:34 crc kubenswrapper[4906]: I0310 00:08:34.616213 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:34 crc kubenswrapper[4906]: I0310 00:08:34.638726 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jgfpc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e902b2c-8bf9-49e8-9820-392f34dbfb10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245f709701748040cb586efcba214ea87e3834f74aec843887bf0a47996d71ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://461ade02a7f0df808708774e4e7ac0d3e416bbc56988253f3f7741b2403f411e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://461ade02a7f0df808708774e4e7ac0d3e416bbc56988253f3f7741b2403f411e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2137af85643edb4160cec0443c914a41c18cde0c48615d328ef82c9cb5976685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2137af85643edb4160cec0443c914a41c18cde0c48615d328ef82c9cb5976685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13423588b4cb100ba7f85054546fccc23f287f313cbd61c4b15ae75e9fd7135a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13423588b4cb100ba7f85054546fccc23f287f313cbd61c4b15ae75e9fd7135a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c384
93d85c4cafaa4905f7ce5ad87a912694bd749ea8f13483b3cdae026d357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c38493d85c4cafaa4905f7ce5ad87a912694bd749ea8f13483b3cdae026d357\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0d67319708302cc33fdc4484ac7ef77101029f60a4733f0f10f7ebb0bbbd86d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0d67319708302cc33fdc4484ac7ef77101029f60a4733f0f10f7ebb0bbbd86d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:04Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860be515a0723147ee78d8b9e1bc3d3fff9675739ced0812fc9207df8a481fec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://860be515a0723147ee78d8b9e1bc3d3fff9675739ced0812fc9207df8a481fec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jgfpc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:34 crc kubenswrapper[4906]: I0310 00:08:34.658187 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7bba372c-755f-4ff1-b1e4-2ccb6079d5a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0eb883e8f9f1bea4366cef5df73fd7a58ca339c0a3460cfccb92085380013dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://159a9ac970d599fe117be8d5340e4ab61b723d5765dc14fca8834807c38ead55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0eca935bc2a1af8e0157cb652fecdf524d319c037423de917084bc258eb00317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58522e059388d727d4f813a54
38c633d60317914d9cc8c3f01eec29c064e8b15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58522e059388d727d4f813a5438c633d60317914d9cc8c3f01eec29c064e8b15\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:34 crc kubenswrapper[4906]: I0310 00:08:34.692411 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1c5481b-1561-42af-9126-977132b7ced2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8b43acbd8d86bc6523a3304162f30fcf4068197c4a56018dc56e1c8ae0bcd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3effb7d619bec3fd1ef64a418efe1e3503cb2dd187ce2188911a90b4034d331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad3cc19888cbd84c556e52eda6a3be770768b7316982d6442717f6aee62191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd888b723dd7b84ded4f27c69d82e2b203fec96dbaab8198df43bf0806b34ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e5dc928d14830f187866775b2cd95c46cff2d7749025b36d16f6489662bafa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:34 crc kubenswrapper[4906]: E0310 00:08:34.692699 4906 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 00:08:34 crc kubenswrapper[4906]: I0310 00:08:34.716485 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:34 crc kubenswrapper[4906]: I0310 00:08:34.734400 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72d61d35-0a64-45a5-8df3-9c429727deba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da01514bb52ced358e921f135d8e59cb221cf1c2c633fb9ee2b9e3d90ef0ffce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f53c021f85bbd1f77bb00169203a7ee8629e31f3
fab47b40dc594d7995d4b82a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bxtw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:34 crc kubenswrapper[4906]: I0310 00:08:34.751816 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vwftl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c755e546-15de-4347-bc90-03a4ea362583\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ee4d840d8905aa3be1765cf7b77cb43f17139b794d7e9760b680ee47bb8d909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5d7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vwftl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:34 crc kubenswrapper[4906]: I0310 00:08:34.773299 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8518e6b0-48c0-4076-a58a-e2ba8751c90f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1ce33b4800ac33b19f03506f197894abd2b965c34db7b506d7f06e8824b0229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1237a4e233c776796482d73c65c85e5d86e592d6a7f53a7b76fea4529fe9e435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e445f27670002f5ac446f9be56896acdbf9e3e30801b073b231cc8ce7ddac7c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75634c35b3e26fec38c8aa89f5a8055117f60566c8d46cf9ec28cad40c688be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab422d45102787731cdbfd6d0e400974761d5af01f0a43d04cc25228c1db30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:06:54.228491 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:06:54.228657 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:06:54.229423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1709734890/tls.crt::/tmp/serving-cert-1709734890/tls.key\\\\\\\"\\\\nI0310 00:06:54.538287 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:06:54.541181 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:06:54.541202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:06:54.541225 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:06:54.541230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:06:54.549119 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 00:06:54.549178 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549189 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549199 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:06:54.549207 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:06:54.549213 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:06:54.549219 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 00:06:54.549128 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 00:06:54.551727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c49063cf40b4074c120bf78de05481a9e5638b11137a47d73d15b3e1a43f41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T0
0:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:34 crc kubenswrapper[4906]: I0310 00:08:34.793843 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fabe7c81-1aa3-4cec-a99f-4697dbe6921d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dab4356524a9cb048a66600ffecef81aa5f9f442b722d6780409970b0fe775d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b037feadaa2542ab3d4b67a5ced4e73ab0764d2c739c7ab3df075007346e16fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:44Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0310 00:06:16.678293 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 00:06:16.681109 1 observer_polling.go:159] Starting file observer\\\\nI0310 00:06:16.724153 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 00:06:16.731026 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 00:06:44.062582 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 00:06:44.062735 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aefd20293d0c60b7686525f505af7f8d1ddd1ab1b3c63f95787470f6ef538df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acd01291605de2fa2a88cfaff323d3a654acbaff0550f4faa872560a04ac582b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3090d401f5153e72de6163581cf256a48f9e2aed39a2a1ab1911f0322fc2fd4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:34 crc kubenswrapper[4906]: I0310 00:08:34.815587 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72569fcc50f473ea417eb7403f0901fc02d50a886ea39d62b2bb02682bc64dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ed04015b5fffd0db47db60c0aec762bd61d476c7cc3c0e726a5bd5de43baef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:34 crc 
kubenswrapper[4906]: I0310 00:08:34.834585 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5bn7b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55d944ce-605e-41a7-9211-a5bc388145f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4pvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4pvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5bn7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:34 crc 
kubenswrapper[4906]: I0310 00:08:34.854148 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://161718aee1b97751dccdff3689a0dc26eb0efb1842899be5bb4994598adb36db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:34 crc kubenswrapper[4906]: I0310 00:08:34.870955 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-85dv2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c494c18-0d46-4e23-8ef5-214938a66a7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c5df9fb921fc76518f8da175e0a9b4e05f248b292a22ee6bcb600e07d94077a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00
:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftt77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-85dv2\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:34 crc kubenswrapper[4906]: I0310 00:08:34.886804 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5zmkq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f18ec986-2e62-4a63-b0b1-8780eac64def\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc46d6cc757c0d334c48e6458543df7d62ae0209adab82b8e8f902b1c686a187\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2026-03-10T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwjjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebd1becf60e90db5a68bbcd8dc00cf672fd5b9f329aa84dd8a798060e8bb385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwjjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5zmkq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-10T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:34 crc kubenswrapper[4906]: I0310 00:08:34.912275 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911516785da6f740d1b7e21cd68006ed12e6e826d87405543b150a5b6331deb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee1a9d86245fa50b0fdbc1fd897326de61c74df8a05667d2dfca5a397a105c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c09a8ef5244efb12dbf1782379b9633e2c35e936c2d1df7d715481c9b3bf0b21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0d58d338e79d263fe3e7d3ca54f65b5ff80376c0c094e3f51c66c00ad8ae878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58958cb3a6a43945e2e85351e464979217571ace2e8aff3602776ecb003df0be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f75ed8b74bcf143893e0244f5d561e39792f3c066b4ea296e5e75dd0999a43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fe06556da9fc87aaebca8e8b6c030f6568bc352414267339059a3f9a1faff25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fe06556da9fc87aaebca8e8b6c030f6568bc352414267339059a3f9a1faff25\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:26Z\\\",\\\"message\\\":\\\"topping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 00:08:26.647077 7154 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 00:08:26.647129 7154 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0310 00:08:26.647139 7154 handler.go:190] Sending *v1.Pod event handler 3 for 
removal\\\\nI0310 00:08:26.647206 7154 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 00:08:26.647203 7154 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 00:08:26.647239 7154 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 00:08:26.647247 7154 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 00:08:26.647268 7154 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 00:08:26.647250 7154 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 00:08:26.647272 7154 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 00:08:26.647315 7154 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 00:08:26.647374 7154 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 00:08:26.647426 7154 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 00:08:26.647475 7154 factory.go:656] Stopping watch factory\\\\nI0310 00:08:26.647502 7154 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hskrb_openshift-ovn-kubernetes(c9f87520-6105-4b6f-ba5a-a232b5dc24c0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b0c40f63f43dcd6fb50a359080c2c076f62858b948841cdbaf8c354d198f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b72675ce4cf6e8d6
5d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hskrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:34 crc kubenswrapper[4906]: I0310 00:08:34.931288 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8681864-d447-44e6-9cf5-6408d49ec58a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4aeb4d2ad9fadbae974054da89166076b81a944e08f5475f1a9382a48cdc623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:34 crc kubenswrapper[4906]: I0310 00:08:34.954393 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c37b204d34ae2c304ca12b8978e189ce99a6c9352c9f68e01d074a8fccc1d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:34 crc kubenswrapper[4906]: I0310 00:08:34.970855 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bd57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a677000-e7dd-40fa-ad3e-94c061293e29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dedc89ecfde3aab4fefadbd750541c4fe076777f8e0583ab87c3a33ad2d3b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hdfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bd57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:34Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:35 crc kubenswrapper[4906]: I0310 00:08:35.575741 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bn7b" Mar 10 00:08:35 crc kubenswrapper[4906]: E0310 00:08:35.575956 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bn7b" podUID="55d944ce-605e-41a7-9211-a5bc388145f1" Mar 10 00:08:36 crc kubenswrapper[4906]: I0310 00:08:36.576597 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:36 crc kubenswrapper[4906]: I0310 00:08:36.576597 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:36 crc kubenswrapper[4906]: I0310 00:08:36.576811 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:36 crc kubenswrapper[4906]: E0310 00:08:36.577031 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:36 crc kubenswrapper[4906]: E0310 00:08:36.577148 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:36 crc kubenswrapper[4906]: E0310 00:08:36.577304 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:36 crc kubenswrapper[4906]: I0310 00:08:36.804185 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:36 crc kubenswrapper[4906]: I0310 00:08:36.804230 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:36 crc kubenswrapper[4906]: I0310 00:08:36.804242 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:36 crc kubenswrapper[4906]: I0310 00:08:36.804260 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:36 crc kubenswrapper[4906]: I0310 00:08:36.804271 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:36Z","lastTransitionTime":"2026-03-10T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:36 crc kubenswrapper[4906]: E0310 00:08:36.819683 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a60179e-98a5-4d7f-9dd0-5aef84f37492\\\",\\\"systemUUID\\\":\\\"5a964b87-c4f2-4bba-95e1-b2c12e6316ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4906]: I0310 00:08:36.823276 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:36 crc kubenswrapper[4906]: I0310 00:08:36.823306 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:36 crc kubenswrapper[4906]: I0310 00:08:36.823315 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:36 crc kubenswrapper[4906]: I0310 00:08:36.823333 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:36 crc kubenswrapper[4906]: I0310 00:08:36.823342 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:36Z","lastTransitionTime":"2026-03-10T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a60179e-98a5-4d7f-9dd0-5aef84f37492\\\",\\\"systemUUID\\\":\\\"5a964b87-c4f2-4bba-95e1-b2c12e6316ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4906]: I0310 00:08:36.861933 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:36 crc kubenswrapper[4906]: I0310 00:08:36.861961 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:36 crc kubenswrapper[4906]: I0310 00:08:36.861973 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:36 crc kubenswrapper[4906]: I0310 00:08:36.861989 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:36 crc kubenswrapper[4906]: I0310 00:08:36.861999 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:36Z","lastTransitionTime":"2026-03-10T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:36 crc kubenswrapper[4906]: E0310 00:08:36.872231 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a60179e-98a5-4d7f-9dd0-5aef84f37492\\\",\\\"systemUUID\\\":\\\"5a964b87-c4f2-4bba-95e1-b2c12e6316ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:36 crc kubenswrapper[4906]: E0310 00:08:36.872344 4906 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 00:08:37 crc kubenswrapper[4906]: I0310 00:08:37.576365 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bn7b" Mar 10 00:08:37 crc kubenswrapper[4906]: E0310 00:08:37.576523 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bn7b" podUID="55d944ce-605e-41a7-9211-a5bc388145f1" Mar 10 00:08:38 crc kubenswrapper[4906]: I0310 00:08:38.576582 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:38 crc kubenswrapper[4906]: I0310 00:08:38.576580 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:38 crc kubenswrapper[4906]: I0310 00:08:38.576597 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:38 crc kubenswrapper[4906]: E0310 00:08:38.576786 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:38 crc kubenswrapper[4906]: E0310 00:08:38.576978 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:38 crc kubenswrapper[4906]: E0310 00:08:38.577033 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:39 crc kubenswrapper[4906]: I0310 00:08:39.575956 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bn7b" Mar 10 00:08:39 crc kubenswrapper[4906]: E0310 00:08:39.576166 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bn7b" podUID="55d944ce-605e-41a7-9211-a5bc388145f1" Mar 10 00:08:39 crc kubenswrapper[4906]: E0310 00:08:39.693745 4906 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 00:08:40 crc kubenswrapper[4906]: I0310 00:08:40.576842 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:40 crc kubenswrapper[4906]: I0310 00:08:40.576927 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:40 crc kubenswrapper[4906]: I0310 00:08:40.576855 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:40 crc kubenswrapper[4906]: E0310 00:08:40.577107 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:40 crc kubenswrapper[4906]: E0310 00:08:40.577161 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:40 crc kubenswrapper[4906]: E0310 00:08:40.577257 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:41 crc kubenswrapper[4906]: I0310 00:08:41.576144 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bn7b" Mar 10 00:08:41 crc kubenswrapper[4906]: E0310 00:08:41.576386 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bn7b" podUID="55d944ce-605e-41a7-9211-a5bc388145f1" Mar 10 00:08:42 crc kubenswrapper[4906]: I0310 00:08:42.576819 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:42 crc kubenswrapper[4906]: E0310 00:08:42.577451 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:42 crc kubenswrapper[4906]: I0310 00:08:42.577833 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:42 crc kubenswrapper[4906]: I0310 00:08:42.577891 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:42 crc kubenswrapper[4906]: E0310 00:08:42.577928 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:42 crc kubenswrapper[4906]: E0310 00:08:42.578074 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:43 crc kubenswrapper[4906]: I0310 00:08:43.575999 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bn7b" Mar 10 00:08:43 crc kubenswrapper[4906]: E0310 00:08:43.576394 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bn7b" podUID="55d944ce-605e-41a7-9211-a5bc388145f1" Mar 10 00:08:43 crc kubenswrapper[4906]: I0310 00:08:43.577317 4906 scope.go:117] "RemoveContainer" containerID="6fe06556da9fc87aaebca8e8b6c030f6568bc352414267339059a3f9a1faff25" Mar 10 00:08:43 crc kubenswrapper[4906]: E0310 00:08:43.577551 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-hskrb_openshift-ovn-kubernetes(c9f87520-6105-4b6f-ba5a-a232b5dc24c0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" podUID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" Mar 10 00:08:44 crc kubenswrapper[4906]: I0310 00:08:44.576280 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:44 crc kubenswrapper[4906]: E0310 00:08:44.576469 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:44 crc kubenswrapper[4906]: I0310 00:08:44.576523 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:44 crc kubenswrapper[4906]: E0310 00:08:44.576748 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:44 crc kubenswrapper[4906]: I0310 00:08:44.577162 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:44 crc kubenswrapper[4906]: E0310 00:08:44.577507 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:44 crc kubenswrapper[4906]: I0310 00:08:44.597423 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72d61d35-0a64-45a5-8df3-9c429727deba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da01514bb52ced358e921f135d8e59cb221cf1c2c633fb9ee2b9e3d90ef0ffce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mou
ntPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f53c021f85bbd1f77bb00169203a7ee8629e31f3fab47b40dc594d7995d4b82a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bxtw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:44 crc kubenswrapper[4906]: I0310 00:08:44.612543 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vwftl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c755e546-15de-4347-bc90-03a4ea362583\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ee4d840d8905aa3be1765cf7b77cb43f17139b794d7e9760b680ee47bb8d909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5d7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vwftl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:44 crc kubenswrapper[4906]: I0310 00:08:44.636904 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8518e6b0-48c0-4076-a58a-e2ba8751c90f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1ce33b4800ac33b19f03506f197894abd2b965c34db7b506d7f06e8824b0229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1237a4e233c776796482d73c65c85e5d86e592d6a7f53a7b76fea4529fe9e435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e445f27670002f5ac446f9be56896acdbf9e3e30801b073b231cc8ce7ddac7c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75634c35b3e26fec38c8aa89f5a8055117f60566c8d46cf9ec28cad40c688be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab422d45102787731cdbfd6d0e400974761d5af01f0a43d04cc25228c1db30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:06:54.228491 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:06:54.228657 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:06:54.229423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1709734890/tls.crt::/tmp/serving-cert-1709734890/tls.key\\\\\\\"\\\\nI0310 00:06:54.538287 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:06:54.541181 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:06:54.541202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:06:54.541225 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:06:54.541230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:06:54.549119 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 00:06:54.549178 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549189 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549199 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:06:54.549207 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:06:54.549213 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:06:54.549219 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 00:06:54.549128 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 00:06:54.551727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c49063cf40b4074c120bf78de05481a9e5638b11137a47d73d15b3e1a43f41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T0
0:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:44 crc kubenswrapper[4906]: I0310 00:08:44.659561 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fabe7c81-1aa3-4cec-a99f-4697dbe6921d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dab4356524a9cb048a66600ffecef81aa5f9f442b722d6780409970b0fe775d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b037feadaa2542ab3d4b67a5ced4e73ab0764d2c739c7ab3df075007346e16fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:44Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0310 00:06:16.678293 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 00:06:16.681109 1 observer_polling.go:159] Starting file observer\\\\nI0310 00:06:16.724153 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 00:06:16.731026 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 00:06:44.062582 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 00:06:44.062735 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aefd20293d0c60b7686525f505af7f8d1ddd1ab1b3c63f95787470f6ef538df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acd01291605de2fa2a88cfaff323d3a654acbaff0550f4faa872560a04ac582b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3090d401f5153e72de6163581cf256a48f9e2aed39a2a1ab1911f0322fc2fd4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:44 crc kubenswrapper[4906]: I0310 00:08:44.681563 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72569fcc50f473ea417eb7403f0901fc02d50a886ea39d62b2bb02682bc64dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ed04015b5fffd0db47db60c0aec762bd61d476c7cc3c0e726a5bd5de43baef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:44 crc 
kubenswrapper[4906]: E0310 00:08:44.694966 4906 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 00:08:44 crc kubenswrapper[4906]: I0310 00:08:44.703632 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5bn7b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55d944ce-605e-41a7-9211-a5bc388145f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4pvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4pvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5bn7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:44 crc 
kubenswrapper[4906]: I0310 00:08:44.723478 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://161718aee1b97751dccdff3689a0dc26eb0efb1842899be5bb4994598adb36db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:44 crc kubenswrapper[4906]: I0310 00:08:44.742349 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-85dv2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c494c18-0d46-4e23-8ef5-214938a66a7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c5df9fb921fc76518f8da175e0a9b4e05f248b292a22ee6bcb600e07d94077a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00
:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftt77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-85dv2\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:44 crc kubenswrapper[4906]: I0310 00:08:44.760951 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5zmkq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f18ec986-2e62-4a63-b0b1-8780eac64def\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc46d6cc757c0d334c48e6458543df7d62ae0209adab82b8e8f902b1c686a187\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2026-03-10T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwjjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebd1becf60e90db5a68bbcd8dc00cf672fd5b9f329aa84dd8a798060e8bb385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwjjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5zmkq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-10T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:44 crc kubenswrapper[4906]: I0310 00:08:44.793217 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911516785da6f740d1b7e21cd68006ed12e6e826d87405543b150a5b6331deb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee1a9d86245fa50b0fdbc1fd897326de61c74df8a05667d2dfca5a397a105c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c09a8ef5244efb12dbf1782379b9633e2c35e936c2d1df7d715481c9b3bf0b21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0d58d338e79d263fe3e7d3ca54f65b5ff80376c0c094e3f51c66c00ad8ae878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58958cb3a6a43945e2e85351e464979217571ace2e8aff3602776ecb003df0be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f75ed8b74bcf143893e0244f5d561e39792f3c066b4ea296e5e75dd0999a43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fe06556da9fc87aaebca8e8b6c030f6568bc352414267339059a3f9a1faff25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fe06556da9fc87aaebca8e8b6c030f6568bc352414267339059a3f9a1faff25\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:26Z\\\",\\\"message\\\":\\\"topping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 00:08:26.647077 7154 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 00:08:26.647129 7154 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0310 00:08:26.647139 7154 handler.go:190] Sending *v1.Pod event handler 3 for 
removal\\\\nI0310 00:08:26.647206 7154 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 00:08:26.647203 7154 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 00:08:26.647239 7154 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 00:08:26.647247 7154 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 00:08:26.647268 7154 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 00:08:26.647250 7154 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 00:08:26.647272 7154 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 00:08:26.647315 7154 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 00:08:26.647374 7154 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 00:08:26.647426 7154 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 00:08:26.647475 7154 factory.go:656] Stopping watch factory\\\\nI0310 00:08:26.647502 7154 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hskrb_openshift-ovn-kubernetes(c9f87520-6105-4b6f-ba5a-a232b5dc24c0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b0c40f63f43dcd6fb50a359080c2c076f62858b948841cdbaf8c354d198f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b72675ce4cf6e8d6
5d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hskrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:44 crc kubenswrapper[4906]: I0310 00:08:44.808700 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8681864-d447-44e6-9cf5-6408d49ec58a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4aeb4d2ad9fadbae974054da89166076b81a944e08f5475f1a9382a48cdc623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:44 crc kubenswrapper[4906]: I0310 00:08:44.834119 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c37b204d34ae2c304ca12b8978e189ce99a6c9352c9f68e01d074a8fccc1d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:44 crc kubenswrapper[4906]: I0310 00:08:44.852262 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bd57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a677000-e7dd-40fa-ad3e-94c061293e29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dedc89ecfde3aab4fefadbd750541c4fe076777f8e0583ab87c3a33ad2d3b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hdfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bd57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:44 crc kubenswrapper[4906]: I0310 00:08:44.874380 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:44 crc kubenswrapper[4906]: I0310 00:08:44.897477 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:44 crc kubenswrapper[4906]: I0310 00:08:44.923577 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jgfpc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e902b2c-8bf9-49e8-9820-392f34dbfb10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245f709701748040cb586efcba214ea87e3834f74aec843887bf0a47996d71ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://461ade02a7f0df808708774e4e7ac0d3e416bbc56988253f3f7741b2403f411e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://461ade02a7f0df808708774e4e7ac0d3e416bbc56988253f3f7741b2403f411e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2137af85643edb4160cec0443c914a41c18cde0c48615d328ef82c9cb5976685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2137af85643edb4160cec0443c914a41c18cde0c48615d328ef82c9cb5976685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13423588b4cb100ba7f85054546fccc23f287f313cbd61c4b15ae75e9fd7135a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13423588b4cb100ba7f85054546fccc23f287f313cbd61c4b15ae75e9fd7135a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c384
93d85c4cafaa4905f7ce5ad87a912694bd749ea8f13483b3cdae026d357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c38493d85c4cafaa4905f7ce5ad87a912694bd749ea8f13483b3cdae026d357\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0d67319708302cc33fdc4484ac7ef77101029f60a4733f0f10f7ebb0bbbd86d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0d67319708302cc33fdc4484ac7ef77101029f60a4733f0f10f7ebb0bbbd86d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:04Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860be515a0723147ee78d8b9e1bc3d3fff9675739ced0812fc9207df8a481fec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://860be515a0723147ee78d8b9e1bc3d3fff9675739ced0812fc9207df8a481fec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jgfpc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:44 crc kubenswrapper[4906]: I0310 00:08:44.939294 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7bba372c-755f-4ff1-b1e4-2ccb6079d5a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0eb883e8f9f1bea4366cef5df73fd7a58ca339c0a3460cfccb92085380013dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://159a9ac970d599fe117be8d5340e4ab61b723d5765dc14fca8834807c38ead55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0eca935bc2a1af8e0157cb652fecdf524d319c037423de917084bc258eb00317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58522e059388d727d4f813a54
38c633d60317914d9cc8c3f01eec29c064e8b15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58522e059388d727d4f813a5438c633d60317914d9cc8c3f01eec29c064e8b15\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:44 crc kubenswrapper[4906]: I0310 00:08:44.963836 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1c5481b-1561-42af-9126-977132b7ced2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8b43acbd8d86bc6523a3304162f30fcf4068197c4a56018dc56e1c8ae0bcd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3effb7d619bec3fd1ef64a418efe1e3503cb2dd187ce2188911a90b4034d331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad3cc19888cbd84c556e52eda6a3be770768b7316982d6442717f6aee62191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd888b723dd7b84ded4f27c69d82e2b203fec96dbaab8198df43bf0806b34ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e5dc928d14830f187866775b2cd95c46cff2d7749025b36d16f6489662bafa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:44 crc kubenswrapper[4906]: I0310 00:08:44.986346 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:44Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:45 crc kubenswrapper[4906]: I0310 00:08:45.342805 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/55d944ce-605e-41a7-9211-a5bc388145f1-metrics-certs\") pod \"network-metrics-daemon-5bn7b\" (UID: \"55d944ce-605e-41a7-9211-a5bc388145f1\") " pod="openshift-multus/network-metrics-daemon-5bn7b" Mar 10 00:08:45 crc kubenswrapper[4906]: E0310 00:08:45.343086 4906 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Mar 10 00:08:45 crc kubenswrapper[4906]: E0310 00:08:45.343285 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55d944ce-605e-41a7-9211-a5bc388145f1-metrics-certs podName:55d944ce-605e-41a7-9211-a5bc388145f1 nodeName:}" failed. No retries permitted until 2026-03-10 00:09:17.343243339 +0000 UTC m=+183.491138571 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/55d944ce-605e-41a7-9211-a5bc388145f1-metrics-certs") pod "network-metrics-daemon-5bn7b" (UID: "55d944ce-605e-41a7-9211-a5bc388145f1") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 00:08:45 crc kubenswrapper[4906]: I0310 00:08:45.576747 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bn7b" Mar 10 00:08:45 crc kubenswrapper[4906]: E0310 00:08:45.576966 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bn7b" podUID="55d944ce-605e-41a7-9211-a5bc388145f1" Mar 10 00:08:46 crc kubenswrapper[4906]: I0310 00:08:46.576674 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:46 crc kubenswrapper[4906]: I0310 00:08:46.576826 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:46 crc kubenswrapper[4906]: I0310 00:08:46.576841 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:46 crc kubenswrapper[4906]: E0310 00:08:46.577002 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:46 crc kubenswrapper[4906]: E0310 00:08:46.577154 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:46 crc kubenswrapper[4906]: E0310 00:08:46.577311 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:47 crc kubenswrapper[4906]: I0310 00:08:47.181022 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:47 crc kubenswrapper[4906]: I0310 00:08:47.181174 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:47 crc kubenswrapper[4906]: I0310 00:08:47.181197 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:47 crc kubenswrapper[4906]: I0310 00:08:47.181228 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:47 crc kubenswrapper[4906]: I0310 00:08:47.181250 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:47Z","lastTransitionTime":"2026-03-10T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:47 crc kubenswrapper[4906]: E0310 00:08:47.204767 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a60179e-98a5-4d7f-9dd0-5aef84f37492\\\",\\\"systemUUID\\\":\\\"5a964b87-c4f2-4bba-95e1-b2c12e6316ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:47 crc kubenswrapper[4906]: I0310 00:08:47.211240 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:47 crc kubenswrapper[4906]: I0310 00:08:47.211324 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:47 crc kubenswrapper[4906]: I0310 00:08:47.211344 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:47 crc kubenswrapper[4906]: I0310 00:08:47.211376 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:47 crc kubenswrapper[4906]: I0310 00:08:47.211575 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:47Z","lastTransitionTime":"2026-03-10T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:47 crc kubenswrapper[4906]: E0310 00:08:47.234300 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a60179e-98a5-4d7f-9dd0-5aef84f37492\\\",\\\"systemUUID\\\":\\\"5a964b87-c4f2-4bba-95e1-b2c12e6316ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:47 crc kubenswrapper[4906]: I0310 00:08:47.239463 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:47 crc kubenswrapper[4906]: I0310 00:08:47.239560 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:47 crc kubenswrapper[4906]: I0310 00:08:47.239579 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:47 crc kubenswrapper[4906]: I0310 00:08:47.239608 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:47 crc kubenswrapper[4906]: I0310 00:08:47.239627 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:47Z","lastTransitionTime":"2026-03-10T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:47 crc kubenswrapper[4906]: E0310 00:08:47.258743 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a60179e-98a5-4d7f-9dd0-5aef84f37492\\\",\\\"systemUUID\\\":\\\"5a964b87-c4f2-4bba-95e1-b2c12e6316ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:47 crc kubenswrapper[4906]: I0310 00:08:47.263124 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:47 crc kubenswrapper[4906]: I0310 00:08:47.263153 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:47 crc kubenswrapper[4906]: I0310 00:08:47.263163 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:47 crc kubenswrapper[4906]: I0310 00:08:47.263179 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:47 crc kubenswrapper[4906]: I0310 00:08:47.263189 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:47Z","lastTransitionTime":"2026-03-10T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:47 crc kubenswrapper[4906]: E0310 00:08:47.278188 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a60179e-98a5-4d7f-9dd0-5aef84f37492\\\",\\\"systemUUID\\\":\\\"5a964b87-c4f2-4bba-95e1-b2c12e6316ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:47 crc kubenswrapper[4906]: I0310 00:08:47.282726 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:47 crc kubenswrapper[4906]: I0310 00:08:47.282757 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:47 crc kubenswrapper[4906]: I0310 00:08:47.282768 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:47 crc kubenswrapper[4906]: I0310 00:08:47.282781 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:47 crc kubenswrapper[4906]: I0310 00:08:47.282791 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:47Z","lastTransitionTime":"2026-03-10T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 00:08:47 crc kubenswrapper[4906]: E0310 00:08:47.296224 4906 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T00:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a60179e-98a5-4d7f-9dd0-5aef84f37492\\\",\\\"systemUUID\\\":\\\"5a964b87-c4f2-4bba-95e1-b2c12e6316ae\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:47 crc kubenswrapper[4906]: E0310 00:08:47.296373 4906 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 00:08:47 crc kubenswrapper[4906]: I0310 00:08:47.576711 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bn7b" Mar 10 00:08:47 crc kubenswrapper[4906]: E0310 00:08:47.576934 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5bn7b" podUID="55d944ce-605e-41a7-9211-a5bc388145f1" Mar 10 00:08:47 crc kubenswrapper[4906]: I0310 00:08:47.739023 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-85dv2_0c494c18-0d46-4e23-8ef5-214938a66a7b/kube-multus/0.log" Mar 10 00:08:47 crc kubenswrapper[4906]: I0310 00:08:47.739112 4906 generic.go:334] "Generic (PLEG): container finished" podID="0c494c18-0d46-4e23-8ef5-214938a66a7b" containerID="7c5df9fb921fc76518f8da175e0a9b4e05f248b292a22ee6bcb600e07d94077a" exitCode=1 Mar 10 00:08:47 crc kubenswrapper[4906]: I0310 00:08:47.739162 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-85dv2" event={"ID":"0c494c18-0d46-4e23-8ef5-214938a66a7b","Type":"ContainerDied","Data":"7c5df9fb921fc76518f8da175e0a9b4e05f248b292a22ee6bcb600e07d94077a"} Mar 10 00:08:47 crc kubenswrapper[4906]: I0310 00:08:47.739828 4906 scope.go:117] "RemoveContainer" containerID="7c5df9fb921fc76518f8da175e0a9b4e05f248b292a22ee6bcb600e07d94077a" Mar 10 00:08:47 crc kubenswrapper[4906]: I0310 00:08:47.767246 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8518e6b0-48c0-4076-a58a-e2ba8751c90f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1ce33b4800ac33b19f03506f197894abd2b965c34db7b506d7f06e8824b0229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1237a4e233c776796482d73c65c85e5d86e592d6a7f53a7b76fea4529fe9e435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e445f27670002f5ac446f9be56896acdbf9e3e30801b073b231cc8ce7ddac7c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75634c35b3e26fec38c8aa89f5a8055117f60566c8d46cf9ec28cad40c688be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab422d45102787731cdbfd6d0e400974761d5af01f0a43d04cc25228c1db30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:54Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0310 00:06:54.228491 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:06:54.228657 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:06:54.229423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1709734890/tls.crt::/tmp/serving-cert-1709734890/tls.key\\\\\\\"\\\\nI0310 00:06:54.538287 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:06:54.541181 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:06:54.541202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:06:54.541225 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:06:54.541230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:06:54.549119 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 00:06:54.549178 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549189 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549199 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:06:54.549207 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:06:54.549213 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:06:54.549219 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 00:06:54.549128 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0310 00:06:54.551727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c49063cf40b4074c120bf78de05481a9e5638b11137a47d73d15b3e1a43f41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca17
0f5955a2014a2d1e693bfb82373f964\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:47 crc kubenswrapper[4906]: I0310 00:08:47.788314 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fabe7c81-1aa3-4cec-a99f-4697dbe6921d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dab4356524a9cb048a66600ffecef81aa5f9f442b722d6780409970b0fe775d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b037feadaa2542ab3d4b67a5ced4e73ab0764d2c739c7ab3df075007346e16fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:44Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0310 00:06:16.678293 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 00:06:16.681109 1 observer_polling.go:159] Starting file observer\\\\nI0310 00:06:16.724153 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 00:06:16.731026 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 00:06:44.062582 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 00:06:44.062735 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aefd20293d0c60b7686525f505af7f8d1ddd1ab1b3c63f95787470f6ef538df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acd01291605de2fa2a88cfaff323d3a654acbaff0550f4faa872560a04ac582b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3090d401f5153e72de6163581cf256a48f9e2aed39a2a1ab1911f0322fc2fd4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:47 crc kubenswrapper[4906]: I0310 00:08:47.811283 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72569fcc50f473ea417eb7403f0901fc02d50a886ea39d62b2bb02682bc64dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ed04015b5fffd0db47db60c0aec762bd61d476c7cc3c0e726a5bd5de43baef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:47 crc 
kubenswrapper[4906]: I0310 00:08:47.829345 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72d61d35-0a64-45a5-8df3-9c429727deba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da01514bb52ced358e921f135d8e59cb221cf1c2c633fb9ee2b9e3d90ef0ffce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f53c021f85bbd1f77bb00169203a7ee8629e31f3fab47b40dc594d7995d4b82a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bxtw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:47 crc kubenswrapper[4906]: I0310 00:08:47.847887 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vwftl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c755e546-15de-4347-bc90-03a4ea362583\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ee4d840d8905aa3be1765cf7b77cb43f17139b794d7e9760b680ee47bb8d909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5d7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vwftl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:47 crc kubenswrapper[4906]: I0310 00:08:47.868757 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://161718aee1b97751dccdff3689a0dc26eb0efb1842899be5bb4994598adb36db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T0
0:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:47 crc kubenswrapper[4906]: I0310 00:08:47.891368 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-85dv2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c494c18-0d46-4e23-8ef5-214938a66a7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c5df9fb921fc76518f8da175e0a9b4e05f248b292a22ee6bcb600e07d94077a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c5df9fb921fc76518f8da175e0a9b4e05f248b292a22ee6bcb600e07d94077a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:47Z\\\",\\\"message\\\":\\\"2026-03-10T00:08:01+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a8a94bab-6dfc-4bf9-93d7-b8a473dafdde\\\\n2026-03-10T00:08:01+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a8a94bab-6dfc-4bf9-93d7-b8a473dafdde to /host/opt/cni/bin/\\\\n2026-03-10T00:08:02Z [verbose] multus-daemon started\\\\n2026-03-10T00:08:02Z [verbose] Readiness Indicator file check\\\\n2026-03-10T00:08:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftt77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-85dv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:47 crc kubenswrapper[4906]: I0310 00:08:47.912540 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5zmkq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f18ec986-2e62-4a63-b0b1-8780eac64def\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc46d6cc757c0d334c48e6458543df7d62ae0209adab82b8e8f902b1c686a187\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwjjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebd1becf60e90db5a68bbcd8dc00cf672fd5b9f329aa84dd8a798060e8bb385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwjjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5zmkq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:47 crc kubenswrapper[4906]: I0310 00:08:47.931601 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5bn7b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55d944ce-605e-41a7-9211-a5bc388145f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4pvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4pvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5bn7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:47 crc 
kubenswrapper[4906]: I0310 00:08:47.948057 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8681864-d447-44e6-9cf5-6408d49ec58a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4aeb4d2ad9fadbae974054da89166076b81a944e08f5475f1a9382a48cdc623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:47 crc kubenswrapper[4906]: I0310 00:08:47.970838 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c37b204d34ae2c304ca12b8978e189ce99a6c9352c9f68e01d074a8fccc1d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:47 crc kubenswrapper[4906]: I0310 00:08:47.990377 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bd57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a677000-e7dd-40fa-ad3e-94c061293e29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dedc89ecfde3aab4fefadbd750541c4fe076777f8e0583ab87c3a33ad2d3b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hdfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bd57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4906]: I0310 00:08:48.019024 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers 
with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911516785da6f740d1b7e21cd68006ed12e6e826d87405543b150a5b6331deb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee1a9d86245fa50b0fdbc1fd897326de61c74df8a05667d2dfca5a397a105c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ov
n-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c09a8ef5244efb12dbf1782379b9633e2c35e936c2d1df7d715481c9b3bf0b21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0d58d338e79d263fe3e7d3ca54f65b5ff80376c0c094e3f51c66c00ad8ae878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58958cb3a6a43945e2e85351e464979217571ace2e8aff3602776ecb003df0be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f75ed8b74bcf143893e0244f5d561e39792f3c066b4ea296e5e75dd0999a43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fe06556da9fc87aaebca8e8b6c030f6568bc352414267339059a3f9a1faff25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fe06556da9fc87aaebca8e8b6c030f6568bc352414267339059a3f9a1faff25\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:26Z\\\",\\\"message\\\":\\\"topping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 00:08:26.647077 7154 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 00:08:26.647129 7154 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0310 00:08:26.647139 7154 handler.go:190] Sending *v1.Pod 
event handler 3 for removal\\\\nI0310 00:08:26.647206 7154 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 00:08:26.647203 7154 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 00:08:26.647239 7154 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 00:08:26.647247 7154 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 00:08:26.647268 7154 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 00:08:26.647250 7154 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 00:08:26.647272 7154 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 00:08:26.647315 7154 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 00:08:26.647374 7154 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 00:08:26.647426 7154 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 00:08:26.647475 7154 factory.go:656] Stopping watch factory\\\\nI0310 00:08:26.647502 7154 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hskrb_openshift-ovn-kubernetes(c9f87520-6105-4b6f-ba5a-a232b5dc24c0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b0c40f63f43dcd6fb50a359080c2c076f62858b948841cdbaf8c354d198f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b72675ce4cf6e8d6
5d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hskrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4906]: I0310 00:08:48.046089 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jgfpc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e902b2c-8bf9-49e8-9820-392f34dbfb10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245f709701748040cb586efcba214ea87e3834f74aec843887bf0a47996d71ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://461ade02a7f0df808708774e4e7ac0d3e416bbc56988253f3f7741b2403f411e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://461ade02a7f0df808708774e4e7ac0d3e416bbc56988253f3f7741b2403f411e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2137af85643edb4160cec0443c914a41c18cde0c48615d328ef82c9cb5976685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2137af85643edb4160cec0443c914a41c18cde0c48615d328ef82c9cb5976685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13423588b4cb100ba7f85054546fccc23f287f313cbd61c4b15ae75e9fd7135a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13423588b4cb100ba7f85054546fccc23f287f313cbd61c4b15ae75e9fd7135a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c384
93d85c4cafaa4905f7ce5ad87a912694bd749ea8f13483b3cdae026d357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c38493d85c4cafaa4905f7ce5ad87a912694bd749ea8f13483b3cdae026d357\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0d67319708302cc33fdc4484ac7ef77101029f60a4733f0f10f7ebb0bbbd86d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0d67319708302cc33fdc4484ac7ef77101029f60a4733f0f10f7ebb0bbbd86d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:04Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860be515a0723147ee78d8b9e1bc3d3fff9675739ced0812fc9207df8a481fec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://860be515a0723147ee78d8b9e1bc3d3fff9675739ced0812fc9207df8a481fec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jgfpc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4906]: I0310 00:08:48.065250 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7bba372c-755f-4ff1-b1e4-2ccb6079d5a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0eb883e8f9f1bea4366cef5df73fd7a58ca339c0a3460cfccb92085380013dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://159a9ac970d599fe117be8d5340e4ab61b723d5765dc14fca8834807c38ead55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0eca935bc2a1af8e0157cb652fecdf524d319c037423de917084bc258eb00317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58522e059388d727d4f813a54
38c633d60317914d9cc8c3f01eec29c064e8b15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58522e059388d727d4f813a5438c633d60317914d9cc8c3f01eec29c064e8b15\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4906]: I0310 00:08:48.091868 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1c5481b-1561-42af-9126-977132b7ced2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8b43acbd8d86bc6523a3304162f30fcf4068197c4a56018dc56e1c8ae0bcd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3effb7d619bec3fd1ef64a418efe1e3503cb2dd187ce2188911a90b4034d331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad3cc19888cbd84c556e52eda6a3be770768b7316982d6442717f6aee62191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd888b723dd7b84ded4f27c69d82e2b203fec96dbaab8198df43bf0806b34ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e5dc928d14830f187866775b2cd95c46cff2d7749025b36d16f6489662bafa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4906]: I0310 00:08:48.109469 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4906]: I0310 00:08:48.129120 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4906]: I0310 00:08:48.152232 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4906]: I0310 00:08:48.575920 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:48 crc kubenswrapper[4906]: I0310 00:08:48.575988 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:48 crc kubenswrapper[4906]: I0310 00:08:48.575920 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:48 crc kubenswrapper[4906]: E0310 00:08:48.576129 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:48 crc kubenswrapper[4906]: E0310 00:08:48.576354 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:48 crc kubenswrapper[4906]: E0310 00:08:48.576505 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:48 crc kubenswrapper[4906]: I0310 00:08:48.747476 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-85dv2_0c494c18-0d46-4e23-8ef5-214938a66a7b/kube-multus/0.log" Mar 10 00:08:48 crc kubenswrapper[4906]: I0310 00:08:48.747599 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-85dv2" event={"ID":"0c494c18-0d46-4e23-8ef5-214938a66a7b","Type":"ContainerStarted","Data":"97deba93385ce59ea5f63333b0faca22d02265e44d69bbfbb0df409c4f16bef1"} Mar 10 00:08:48 crc kubenswrapper[4906]: I0310 00:08:48.767314 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d8681864-d447-44e6-9cf5-6408d49ec58a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4aeb4d2ad9fadbae974054da89166076b81a944e08f5475f1a9382a48cdc623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69434cbd9c9d4490c9cfc53d499f27748624f9709ca6f469702fc69374e1b069\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4906]: I0310 00:08:48.790510 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16c37b204d34ae2c304ca12b8978e189ce99a6c9352c9f68e01d074a8fccc1d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"D
isabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4906]: I0310 00:08:48.811005 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bd57" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a677000-e7dd-40fa-ad3e-94c061293e29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dedc89ecfde3aab4fefadbd750541c4fe076777f8e0583ab87c3a33ad2d3b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-
node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6hdfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:07:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bd57\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4906]: I0310 00:08:48.845777 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://911516785da6f740d1b7e21cd68006ed12e6e826d87405543b150a5b6331deb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fee1a9d86245fa50b0fdbc1fd897326de61c74df8a05667d2dfca5a397a105c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c09a8ef5244efb12dbf1782379b9633e2c35e936c2d1df7d715481c9b3bf0b21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0d58d338e79d263fe3e7d3ca54f65b5ff80376c0c094e3f51c66c00ad8ae878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58958cb3a6a43945e2e85351e464979217571ace2e8aff3602776ecb003df0be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f75ed8b74bcf143893e0244f5d561e39792f3c066b4ea296e5e75dd0999a43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fe06556da9fc87aaebca8e8b6c030f6568bc352414267339059a3f9a1faff25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fe06556da9fc87aaebca8e8b6c030f6568bc352414267339059a3f9a1faff25\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:26Z\\\",\\\"message\\\":\\\"topping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0310 00:08:26.647077 7154 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 00:08:26.647129 7154 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0310 00:08:26.647139 7154 handler.go:190] Sending *v1.Pod event handler 3 for 
removal\\\\nI0310 00:08:26.647206 7154 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0310 00:08:26.647203 7154 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0310 00:08:26.647239 7154 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 00:08:26.647247 7154 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 00:08:26.647268 7154 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 00:08:26.647250 7154 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 00:08:26.647272 7154 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0310 00:08:26.647315 7154 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 00:08:26.647374 7154 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0310 00:08:26.647426 7154 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 00:08:26.647475 7154 factory.go:656] Stopping watch factory\\\\nI0310 00:08:26.647502 7154 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-hskrb_openshift-ovn-kubernetes(c9f87520-6105-4b6f-ba5a-a232b5dc24c0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://410b0c40f63f43dcd6fb50a359080c2c076f62858b948841cdbaf8c354d198f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b72675ce4cf6e8d6
5d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbtkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-hskrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4906]: I0310 00:08:48.868869 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4906]: I0310 00:08:48.897886 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jgfpc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e902b2c-8bf9-49e8-9820-392f34dbfb10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://245f709701748040cb586efcba214ea87e3834f74aec843887bf0a47996d71ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://461ade02a7f0df808708774e4e7ac0d3e416bbc56988253f3f7741b2403f411e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://461ade02a7f0df808708774e4e7ac0d3e416bbc56988253f3f7741b2403f411e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2137af85643edb4160cec0443c914a41c18cde0c48615d328ef82c9cb5976685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2137af85643edb4160cec0443c914a41c18cde0c48615d328ef82c9cb5976685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:01Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13423588b4cb100ba7f85054546fccc23f287f313cbd61c4b15ae75e9fd7135a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13423588b4cb100ba7f85054546fccc23f287f313cbd61c4b15ae75e9fd7135a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c384
93d85c4cafaa4905f7ce5ad87a912694bd749ea8f13483b3cdae026d357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c38493d85c4cafaa4905f7ce5ad87a912694bd749ea8f13483b3cdae026d357\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0d67319708302cc33fdc4484ac7ef77101029f60a4733f0f10f7ebb0bbbd86d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0d67319708302cc33fdc4484ac7ef77101029f60a4733f0f10f7ebb0bbbd86d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:04Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://860be515a0723147ee78d8b9e1bc3d3fff9675739ced0812fc9207df8a481fec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://860be515a0723147ee78d8b9e1bc3d3fff9675739ced0812fc9207df8a481fec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:08:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrdxj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jgfpc\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4906]: I0310 00:08:48.918599 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7bba372c-755f-4ff1-b1e4-2ccb6079d5a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0eb883e8f9f1bea4366cef5df73fd7a58ca339c0a3460cfccb92085380013dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://159a9ac970d599fe117be8d5340e4ab61b723d5765dc14fca8834807c38ead55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0eca935bc2a1af8e0157cb652fecdf524d319c037423de917084bc258eb00317\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58522e059388d727d4f813a54
38c633d60317914d9cc8c3f01eec29c064e8b15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58522e059388d727d4f813a5438c633d60317914d9cc8c3f01eec29c064e8b15\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4906]: I0310 00:08:48.956092 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1c5481b-1561-42af-9126-977132b7ced2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e8b43acbd8d86bc6523a3304162f30fcf4068197c4a56018dc56e1c8ae0bcd4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3effb7d619bec3fd1ef64a418efe1e3503cb2dd187ce2188911a90b4034d331d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad3cc19888cbd84c556e52eda6a3be770768b7316982d6442717f6aee62191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecd888b723dd7b84ded4f27c69d82e2b203fec96dbaab8198df43bf0806b34ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9e5dc928d14830f187866775b2cd95c46cff2d7749025b36d16f6489662bafa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c393003b225bcbb2cdb8e3964c2f8e7a03213daa6628470b237398efc76037a\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://726be2e042e25b5dcd575a0d14b0b79dca031be09dab3e814e16359fadc6495b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca870bd09b0c1faf8e2a4e28589c1dc2ba677bf20e01bbfd9b3fd15739e8463\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:48 crc kubenswrapper[4906]: I0310 00:08:48.979174 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:49 crc kubenswrapper[4906]: I0310 00:08:49.002789 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:49 crc kubenswrapper[4906]: I0310 00:08:49.021305 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vwftl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c755e546-15de-4347-bc90-03a4ea362583\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ee4d840d8905aa3be1765cf7b77cb43f17139b794d7e9760b680ee47bb8d909\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj5d7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:06Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vwftl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:49 crc kubenswrapper[4906]: I0310 00:08:49.044773 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8518e6b0-48c0-4076-a58a-e2ba8751c90f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1ce33b4800ac33b19f03506f197894abd2b965c34db7b506d7f06e8824b0229\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1237a4e233c776796482d73c65c85e5d86e592d6a7f53a7b76fea4529fe9e435\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e445f27670002f5ac446f9be56896acdbf9e3e30801b073b231cc8ce7ddac7c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e75634c35b3e26fec38c8aa89f5a8055117f60566c8d46cf9ec28cad40c688be\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4ab422d45102787731cdbfd6d0e400974761d5af01f0a43d04cc25228c1db30\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0310 00:06:54.228491 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0310 00:06:54.228657 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 00:06:54.229423 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1709734890/tls.crt::/tmp/serving-cert-1709734890/tls.key\\\\\\\"\\\\nI0310 00:06:54.538287 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0310 00:06:54.541181 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0310 00:06:54.541202 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0310 00:06:54.541225 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0310 00:06:54.541230 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0310 00:06:54.549119 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0310 00:06:54.549178 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549189 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0310 00:06:54.549199 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0310 00:06:54.549207 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0310 00:06:54.549213 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0310 00:06:54.549219 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0310 00:06:54.549128 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0310 00:06:54.551727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03c49063cf40b4074c120bf78de05481a9e5638b11137a47d73d15b3e1a43f41\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T0
0:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T00:06:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:49 crc kubenswrapper[4906]: I0310 00:08:49.066217 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fabe7c81-1aa3-4cec-a99f-4697dbe6921d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dab4356524a9cb048a66600ffecef81aa5f9f442b722d6780409970b0fe775d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b037feadaa2542ab3d4b67a5ced4e73ab0764d2c739c7ab3df075007346e16fe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T00:06:44Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0310 00:06:16.678293 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 00:06:16.681109 1 observer_polling.go:159] Starting file observer\\\\nI0310 00:06:16.724153 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 00:06:16.731026 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 00:06:44.062582 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 00:06:44.062735 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aefd20293d0c60b7686525f505af7f8d1ddd1ab1b3c63f95787470f6ef538df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acd01291605de2fa2a88cfaff323d3a654acbaff0550f4faa872560a04ac582b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3090d401f5153e72de6163581cf256a48f9e2aed39a2a1ab1911f0322fc2fd4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:06:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:49 crc kubenswrapper[4906]: I0310 00:08:49.090405 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72569fcc50f473ea417eb7403f0901fc02d50a886ea39d62b2bb02682bc64dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ed04015b5fffd0db47db60c0aec762bd61d476c7cc3c0e726a5bd5de43baef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:49 crc 
kubenswrapper[4906]: I0310 00:08:49.109507 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72d61d35-0a64-45a5-8df3-9c429727deba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da01514bb52ced358e921f135d8e59cb221cf1c2c633fb9ee2b9e3d90ef0ffce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f53c021f85bbd1f77bb00169203a7ee8629e31f3fab47b40dc594d7995d4b82a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5ccf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bxtw4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:49 crc kubenswrapper[4906]: I0310 00:08:49.132203 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T00:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://161718aee1b97751dccdff3689a0dc26eb0efb1842899be5bb4994598adb36db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:49 crc kubenswrapper[4906]: I0310 00:08:49.153763 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-85dv2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c494c18-0d46-4e23-8ef5-214938a66a7b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://97deba93385ce59ea5f63333b0faca22d02265e44d69bbfbb0df409c4f16bef1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c5df9fb921fc76518f8da175e0a9b4e05f248b292a22ee6bcb600e07d94077a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T00:08:47Z\\\",\\\"message\\\":\\\"2026-03-10T00:08:01+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_a8a94bab-6dfc-4bf9-93d7-b8a473dafdde\\\\n2026-03-10T00:08:01+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a8a94bab-6dfc-4bf9-93d7-b8a473dafdde to /host/opt/cni/bin/\\\\n2026-03-10T00:08:02Z [verbose] multus-daemon started\\\\n2026-03-10T00:08:02Z [verbose] Readiness Indicator file check\\\\n2026-03-10T00:08:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T00:08:00Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftt77\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-85dv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:49 crc kubenswrapper[4906]: I0310 00:08:49.169410 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5zmkq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f18ec986-2e62-4a63-b0b1-8780eac64def\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc46d6cc757c0d334c48e6458543df7d62ae0209adab82b8e8f902b1c686a187\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwjjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebd1becf60e90db5a68bbcd8dc00cf672fd5b
9f329aa84dd8a798060e8bb385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bwjjq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-5zmkq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:49 crc kubenswrapper[4906]: I0310 00:08:49.183273 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-5bn7b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55d944ce-605e-41a7-9211-a5bc388145f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T00:08:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4pvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4pvg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T00:08:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-5bn7b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 10 00:08:49 crc 
kubenswrapper[4906]: I0310 00:08:49.576443 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bn7b" Mar 10 00:08:49 crc kubenswrapper[4906]: E0310 00:08:49.576754 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bn7b" podUID="55d944ce-605e-41a7-9211-a5bc388145f1" Mar 10 00:08:49 crc kubenswrapper[4906]: E0310 00:08:49.696211 4906 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 00:08:50 crc kubenswrapper[4906]: I0310 00:08:50.575960 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:50 crc kubenswrapper[4906]: I0310 00:08:50.576023 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:50 crc kubenswrapper[4906]: E0310 00:08:50.576275 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:50 crc kubenswrapper[4906]: I0310 00:08:50.576055 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:50 crc kubenswrapper[4906]: E0310 00:08:50.576427 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:50 crc kubenswrapper[4906]: E0310 00:08:50.576597 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:51 crc kubenswrapper[4906]: I0310 00:08:51.576624 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bn7b" Mar 10 00:08:51 crc kubenswrapper[4906]: E0310 00:08:51.576911 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bn7b" podUID="55d944ce-605e-41a7-9211-a5bc388145f1" Mar 10 00:08:52 crc kubenswrapper[4906]: I0310 00:08:52.576316 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:52 crc kubenswrapper[4906]: E0310 00:08:52.576508 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:52 crc kubenswrapper[4906]: I0310 00:08:52.576686 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:52 crc kubenswrapper[4906]: E0310 00:08:52.576822 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:52 crc kubenswrapper[4906]: I0310 00:08:52.576941 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:52 crc kubenswrapper[4906]: E0310 00:08:52.577274 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:53 crc kubenswrapper[4906]: I0310 00:08:53.576794 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bn7b" Mar 10 00:08:53 crc kubenswrapper[4906]: E0310 00:08:53.577835 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bn7b" podUID="55d944ce-605e-41a7-9211-a5bc388145f1" Mar 10 00:08:54 crc kubenswrapper[4906]: I0310 00:08:54.576218 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:54 crc kubenswrapper[4906]: I0310 00:08:54.576342 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:54 crc kubenswrapper[4906]: I0310 00:08:54.576937 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:54 crc kubenswrapper[4906]: E0310 00:08:54.577182 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:54 crc kubenswrapper[4906]: E0310 00:08:54.577436 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:54 crc kubenswrapper[4906]: E0310 00:08:54.578052 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:54 crc kubenswrapper[4906]: I0310 00:08:54.650549 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=68.650517486 podStartE2EDuration="1m8.650517486s" podCreationTimestamp="2026-03-10 00:07:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:08:54.650496955 +0000 UTC m=+160.798392097" watchObservedRunningTime="2026-03-10 00:08:54.650517486 +0000 UTC m=+160.798412638" Mar 10 00:08:54 crc kubenswrapper[4906]: I0310 00:08:54.694536 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-9bd57" podStartSLOduration=106.69451116 podStartE2EDuration="1m46.69451116s" podCreationTimestamp="2026-03-10 00:07:08 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:08:54.682069408 +0000 UTC m=+160.829964530" watchObservedRunningTime="2026-03-10 00:08:54.69451116 +0000 UTC m=+160.842406282" Mar 10 00:08:54 crc kubenswrapper[4906]: E0310 00:08:54.697325 4906 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 00:08:54 crc kubenswrapper[4906]: I0310 00:08:54.745619 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-jgfpc" podStartSLOduration=106.745585084 podStartE2EDuration="1m46.745585084s" podCreationTimestamp="2026-03-10 00:07:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:08:54.728968514 +0000 UTC m=+160.876863626" watchObservedRunningTime="2026-03-10 00:08:54.745585084 +0000 UTC m=+160.893480236" Mar 10 00:08:54 crc kubenswrapper[4906]: I0310 00:08:54.746468 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=34.746457669 podStartE2EDuration="34.746457669s" podCreationTimestamp="2026-03-10 00:08:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:08:54.745132321 +0000 UTC m=+160.893027443" watchObservedRunningTime="2026-03-10 00:08:54.746457669 +0000 UTC m=+160.894352821" Mar 10 00:08:54 crc kubenswrapper[4906]: I0310 00:08:54.782074 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=56.782045195 podStartE2EDuration="56.782045195s" podCreationTimestamp="2026-03-10 00:07:58 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:08:54.779368049 +0000 UTC m=+160.927263181" watchObservedRunningTime="2026-03-10 00:08:54.782045195 +0000 UTC m=+160.929940347" Mar 10 00:08:54 crc kubenswrapper[4906]: I0310 00:08:54.827256 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podStartSLOduration=106.827224472 podStartE2EDuration="1m46.827224472s" podCreationTimestamp="2026-03-10 00:07:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:08:54.811775675 +0000 UTC m=+160.959670827" watchObservedRunningTime="2026-03-10 00:08:54.827224472 +0000 UTC m=+160.975119624" Mar 10 00:08:54 crc kubenswrapper[4906]: I0310 00:08:54.828513 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-vwftl" podStartSLOduration=106.828500558 podStartE2EDuration="1m46.828500558s" podCreationTimestamp="2026-03-10 00:07:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:08:54.828378625 +0000 UTC m=+160.976273757" watchObservedRunningTime="2026-03-10 00:08:54.828500558 +0000 UTC m=+160.976395710" Mar 10 00:08:54 crc kubenswrapper[4906]: I0310 00:08:54.925862 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=90.92584145 podStartE2EDuration="1m30.92584145s" podCreationTimestamp="2026-03-10 00:07:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:08:54.925250674 +0000 UTC m=+161.073145796" watchObservedRunningTime="2026-03-10 00:08:54.92584145 +0000 
UTC m=+161.073736562" Mar 10 00:08:54 crc kubenswrapper[4906]: I0310 00:08:54.946564 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=45.946540006 podStartE2EDuration="45.946540006s" podCreationTimestamp="2026-03-10 00:08:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:08:54.945895428 +0000 UTC m=+161.093790540" watchObservedRunningTime="2026-03-10 00:08:54.946540006 +0000 UTC m=+161.094435128" Mar 10 00:08:55 crc kubenswrapper[4906]: I0310 00:08:55.035889 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-5zmkq" podStartSLOduration=106.035849771 podStartE2EDuration="1m46.035849771s" podCreationTimestamp="2026-03-10 00:07:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:08:55.035694387 +0000 UTC m=+161.183589499" watchObservedRunningTime="2026-03-10 00:08:55.035849771 +0000 UTC m=+161.183744923" Mar 10 00:08:55 crc kubenswrapper[4906]: I0310 00:08:55.036743 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-85dv2" podStartSLOduration=107.036733566 podStartE2EDuration="1m47.036733566s" podCreationTimestamp="2026-03-10 00:07:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:08:55.020856717 +0000 UTC m=+161.168751829" watchObservedRunningTime="2026-03-10 00:08:55.036733566 +0000 UTC m=+161.184628718" Mar 10 00:08:55 crc kubenswrapper[4906]: I0310 00:08:55.576176 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bn7b" Mar 10 00:08:55 crc kubenswrapper[4906]: E0310 00:08:55.576941 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bn7b" podUID="55d944ce-605e-41a7-9211-a5bc388145f1" Mar 10 00:08:55 crc kubenswrapper[4906]: I0310 00:08:55.577592 4906 scope.go:117] "RemoveContainer" containerID="6fe06556da9fc87aaebca8e8b6c030f6568bc352414267339059a3f9a1faff25" Mar 10 00:08:55 crc kubenswrapper[4906]: I0310 00:08:55.776441 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hskrb_c9f87520-6105-4b6f-ba5a-a232b5dc24c0/ovnkube-controller/2.log" Mar 10 00:08:55 crc kubenswrapper[4906]: I0310 00:08:55.779578 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" event={"ID":"c9f87520-6105-4b6f-ba5a-a232b5dc24c0","Type":"ContainerStarted","Data":"5e998996c1ab592a23043a363deaef69007f550b1ab7d626bc644bc82db78bda"} Mar 10 00:08:55 crc kubenswrapper[4906]: I0310 00:08:55.781069 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:08:55 crc kubenswrapper[4906]: I0310 00:08:55.820736 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" podStartSLOduration=107.820707631 podStartE2EDuration="1m47.820707631s" podCreationTimestamp="2026-03-10 00:07:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:08:55.819556859 +0000 UTC m=+161.967451971" watchObservedRunningTime="2026-03-10 
00:08:55.820707631 +0000 UTC m=+161.968602753" Mar 10 00:08:56 crc kubenswrapper[4906]: I0310 00:08:56.466920 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5bn7b"] Mar 10 00:08:56 crc kubenswrapper[4906]: I0310 00:08:56.467085 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bn7b" Mar 10 00:08:56 crc kubenswrapper[4906]: E0310 00:08:56.467237 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bn7b" podUID="55d944ce-605e-41a7-9211-a5bc388145f1" Mar 10 00:08:56 crc kubenswrapper[4906]: I0310 00:08:56.576910 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:56 crc kubenswrapper[4906]: E0310 00:08:56.577153 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:56 crc kubenswrapper[4906]: I0310 00:08:56.577542 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:56 crc kubenswrapper[4906]: E0310 00:08:56.577709 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:56 crc kubenswrapper[4906]: I0310 00:08:56.578002 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:56 crc kubenswrapper[4906]: E0310 00:08:56.578204 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:57 crc kubenswrapper[4906]: I0310 00:08:57.556894 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 00:08:57 crc kubenswrapper[4906]: I0310 00:08:57.556959 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 00:08:57 crc kubenswrapper[4906]: I0310 00:08:57.556976 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 00:08:57 crc kubenswrapper[4906]: I0310 00:08:57.557004 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 00:08:57 crc kubenswrapper[4906]: I0310 00:08:57.557024 4906 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T00:08:57Z","lastTransitionTime":"2026-03-10T00:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 00:08:57 crc kubenswrapper[4906]: I0310 00:08:57.629595 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-q74bl"] Mar 10 00:08:57 crc kubenswrapper[4906]: I0310 00:08:57.630280 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q74bl" Mar 10 00:08:57 crc kubenswrapper[4906]: I0310 00:08:57.632586 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 10 00:08:57 crc kubenswrapper[4906]: I0310 00:08:57.633987 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 10 00:08:57 crc kubenswrapper[4906]: I0310 00:08:57.635233 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 10 00:08:57 crc kubenswrapper[4906]: I0310 00:08:57.635613 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 10 00:08:57 crc kubenswrapper[4906]: I0310 00:08:57.714922 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4360df9a-8b89-4393-a18d-6cf811b73a93-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-q74bl\" (UID: \"4360df9a-8b89-4393-a18d-6cf811b73a93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q74bl" Mar 10 00:08:57 crc kubenswrapper[4906]: I0310 00:08:57.715010 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4360df9a-8b89-4393-a18d-6cf811b73a93-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-q74bl\" (UID: \"4360df9a-8b89-4393-a18d-6cf811b73a93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q74bl" Mar 10 00:08:57 crc kubenswrapper[4906]: I0310 00:08:57.715133 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/4360df9a-8b89-4393-a18d-6cf811b73a93-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-q74bl\" (UID: \"4360df9a-8b89-4393-a18d-6cf811b73a93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q74bl" Mar 10 00:08:57 crc kubenswrapper[4906]: I0310 00:08:57.715270 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/4360df9a-8b89-4393-a18d-6cf811b73a93-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-q74bl\" (UID: \"4360df9a-8b89-4393-a18d-6cf811b73a93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q74bl" Mar 10 00:08:57 crc kubenswrapper[4906]: I0310 00:08:57.715543 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4360df9a-8b89-4393-a18d-6cf811b73a93-service-ca\") pod \"cluster-version-operator-5c965bbfc6-q74bl\" (UID: \"4360df9a-8b89-4393-a18d-6cf811b73a93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q74bl" Mar 10 00:08:57 crc kubenswrapper[4906]: I0310 00:08:57.816546 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4360df9a-8b89-4393-a18d-6cf811b73a93-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-q74bl\" (UID: \"4360df9a-8b89-4393-a18d-6cf811b73a93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q74bl" Mar 10 00:08:57 crc kubenswrapper[4906]: I0310 00:08:57.816613 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4360df9a-8b89-4393-a18d-6cf811b73a93-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-q74bl\" (UID: \"4360df9a-8b89-4393-a18d-6cf811b73a93\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q74bl" Mar 10 00:08:57 crc kubenswrapper[4906]: I0310 00:08:57.816679 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/4360df9a-8b89-4393-a18d-6cf811b73a93-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-q74bl\" (UID: \"4360df9a-8b89-4393-a18d-6cf811b73a93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q74bl" Mar 10 00:08:57 crc kubenswrapper[4906]: I0310 00:08:57.816729 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/4360df9a-8b89-4393-a18d-6cf811b73a93-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-q74bl\" (UID: \"4360df9a-8b89-4393-a18d-6cf811b73a93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q74bl" Mar 10 00:08:57 crc kubenswrapper[4906]: I0310 00:08:57.816838 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4360df9a-8b89-4393-a18d-6cf811b73a93-service-ca\") pod \"cluster-version-operator-5c965bbfc6-q74bl\" (UID: \"4360df9a-8b89-4393-a18d-6cf811b73a93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q74bl" Mar 10 00:08:57 crc kubenswrapper[4906]: I0310 00:08:57.816921 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/4360df9a-8b89-4393-a18d-6cf811b73a93-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-q74bl\" (UID: \"4360df9a-8b89-4393-a18d-6cf811b73a93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q74bl" Mar 10 00:08:57 crc kubenswrapper[4906]: I0310 00:08:57.817067 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/4360df9a-8b89-4393-a18d-6cf811b73a93-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-q74bl\" (UID: \"4360df9a-8b89-4393-a18d-6cf811b73a93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q74bl" Mar 10 00:08:57 crc kubenswrapper[4906]: I0310 00:08:57.818554 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4360df9a-8b89-4393-a18d-6cf811b73a93-service-ca\") pod \"cluster-version-operator-5c965bbfc6-q74bl\" (UID: \"4360df9a-8b89-4393-a18d-6cf811b73a93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q74bl" Mar 10 00:08:57 crc kubenswrapper[4906]: I0310 00:08:57.828631 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4360df9a-8b89-4393-a18d-6cf811b73a93-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-q74bl\" (UID: \"4360df9a-8b89-4393-a18d-6cf811b73a93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q74bl" Mar 10 00:08:57 crc kubenswrapper[4906]: I0310 00:08:57.848118 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4360df9a-8b89-4393-a18d-6cf811b73a93-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-q74bl\" (UID: \"4360df9a-8b89-4393-a18d-6cf811b73a93\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q74bl" Mar 10 00:08:57 crc kubenswrapper[4906]: I0310 00:08:57.962178 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q74bl" Mar 10 00:08:57 crc kubenswrapper[4906]: W0310 00:08:57.989337 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4360df9a_8b89_4393_a18d_6cf811b73a93.slice/crio-d49fa116094a6b35b095361e556e7dcfcf85c59b766724aadc611aca48292acc WatchSource:0}: Error finding container d49fa116094a6b35b095361e556e7dcfcf85c59b766724aadc611aca48292acc: Status 404 returned error can't find the container with id d49fa116094a6b35b095361e556e7dcfcf85c59b766724aadc611aca48292acc Mar 10 00:08:58 crc kubenswrapper[4906]: I0310 00:08:58.546517 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 10 00:08:58 crc kubenswrapper[4906]: I0310 00:08:58.560574 4906 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 10 00:08:58 crc kubenswrapper[4906]: I0310 00:08:58.576236 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bn7b" Mar 10 00:08:58 crc kubenswrapper[4906]: E0310 00:08:58.576425 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5bn7b" podUID="55d944ce-605e-41a7-9211-a5bc388145f1" Mar 10 00:08:58 crc kubenswrapper[4906]: I0310 00:08:58.576667 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:08:58 crc kubenswrapper[4906]: E0310 00:08:58.576756 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 00:08:58 crc kubenswrapper[4906]: I0310 00:08:58.576810 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:08:58 crc kubenswrapper[4906]: I0310 00:08:58.576931 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:08:58 crc kubenswrapper[4906]: E0310 00:08:58.576959 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 00:08:58 crc kubenswrapper[4906]: E0310 00:08:58.577152 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 00:08:58 crc kubenswrapper[4906]: I0310 00:08:58.795114 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q74bl" event={"ID":"4360df9a-8b89-4393-a18d-6cf811b73a93","Type":"ContainerStarted","Data":"d8252322f01a1af711b745a10b7d284af1eb904b902cded728dd141ffdb3280d"} Mar 10 00:08:58 crc kubenswrapper[4906]: I0310 00:08:58.795202 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q74bl" event={"ID":"4360df9a-8b89-4393-a18d-6cf811b73a93","Type":"ContainerStarted","Data":"d49fa116094a6b35b095361e556e7dcfcf85c59b766724aadc611aca48292acc"} Mar 10 00:08:58 crc kubenswrapper[4906]: I0310 00:08:58.819861 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-q74bl" podStartSLOduration=110.81982639 podStartE2EDuration="1m50.81982639s" podCreationTimestamp="2026-03-10 00:07:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:08:58.818737539 +0000 UTC m=+164.966632681" watchObservedRunningTime="2026-03-10 00:08:58.81982639 +0000 UTC m=+164.967721532" Mar 10 00:09:00 crc kubenswrapper[4906]: I0310 00:09:00.575953 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bn7b" Mar 10 00:09:00 crc kubenswrapper[4906]: I0310 00:09:00.575992 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:09:00 crc kubenswrapper[4906]: I0310 00:09:00.576094 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:09:00 crc kubenswrapper[4906]: I0310 00:09:00.578117 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:09:00 crc kubenswrapper[4906]: I0310 00:09:00.580588 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 10 00:09:00 crc kubenswrapper[4906]: I0310 00:09:00.580589 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 10 00:09:00 crc kubenswrapper[4906]: I0310 00:09:00.581945 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 10 00:09:00 crc kubenswrapper[4906]: I0310 00:09:00.582208 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 10 00:09:00 crc kubenswrapper[4906]: I0310 00:09:00.582248 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 10 00:09:00 crc kubenswrapper[4906]: I0310 00:09:00.582544 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 10 00:09:00 crc kubenswrapper[4906]: I0310 00:09:00.887682 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.650054 4906 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.712256 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-t5bxf"] Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.713192 4906 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-t5bxf" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.725026 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.725137 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.725187 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b492s"] Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.725281 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.726063 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.726230 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.726236 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b492s" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.726476 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.726768 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.727061 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kzklf"] Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.727122 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.727509 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.727544 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.728373 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kzklf" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.731847 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-jws8x"] Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.732897 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jws8x" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.733601 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.733987 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.735265 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-v25gg"] Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.735960 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-v25gg" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.738863 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-lrj5w"] Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.739460 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-lrj5w" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.743275 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-x7t4l"] Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.743938 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-x7t4l" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.752676 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.752737 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.752915 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.753302 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.753660 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.753717 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.755108 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mq564"] Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.765005 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-mq564" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.767584 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.776705 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.777567 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.778109 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.778596 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.778694 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.778748 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.779030 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.779207 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.781757 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 10 00:09:07 crc 
kubenswrapper[4906]: I0310 00:09:07.783898 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.797519 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.797669 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.798069 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.799180 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.799423 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-vt2mt"] Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.799582 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.800173 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-87hqt"] Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.800538 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-87hqt" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.800617 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.800787 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.800915 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.800975 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.801018 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.801115 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.801144 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.801243 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.801256 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.801352 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vt2mt" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.801355 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.802186 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.801410 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.802758 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.802860 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.804685 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.806151 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-ltqv2"] Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.806629 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-sjjvc"] Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.806833 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.806862 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 10 00:09:07 crc 
kubenswrapper[4906]: I0310 00:09:07.806987 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-q9zx6"] Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.807066 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.807107 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-ltqv2" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.807189 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.807362 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-q9zx6" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.807477 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.807538 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sjjvc" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.807784 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.808163 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.810028 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-zkqrr"] Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.810271 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.810565 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-zkqrr" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.818050 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.819849 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.820082 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.820207 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.820549 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" 
Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.820677 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-pruner-29551680-hclh4"] Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.821248 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zxgxz"] Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.821583 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zxgxz" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.821959 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29551680-hclh4" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.822117 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-qfc95"] Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.822667 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-qfc95" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.824803 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2g9zb"] Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.825356 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2g9zb" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.832191 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.832678 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.833018 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.833072 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.833161 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.833505 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.833703 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.833766 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.835078 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-qnppg"] Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.835789 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-qnppg" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.848456 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.848770 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.848925 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.849077 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.849229 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.849455 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.849656 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.849894 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.850119 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.855567 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-66jxp"] Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.856760 
4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.857205 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.857373 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.857729 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b062963a-a86b-43bb-ba19-c49fa8ac7cf1-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-b492s\" (UID: \"b062963a-a86b-43bb-ba19-c49fa8ac7cf1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b492s" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.857798 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9f070bb-41a2-458e-9818-063ffb52008a-serving-cert\") pod \"console-operator-58897d9998-x7t4l\" (UID: \"c9f070bb-41a2-458e-9818-063ffb52008a\") " pod="openshift-console-operator/console-operator-58897d9998-x7t4l" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.857847 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwhqv\" (UniqueName: \"kubernetes.io/projected/b062963a-a86b-43bb-ba19-c49fa8ac7cf1-kube-api-access-dwhqv\") pod \"openshift-apiserver-operator-796bbdcf4f-b492s\" (UID: \"b062963a-a86b-43bb-ba19-c49fa8ac7cf1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b492s" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.858000 4906 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7ef57cce-ba04-4605-a245-edfad55f6f69-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-jws8x\" (UID: \"7ef57cce-ba04-4605-a245-edfad55f6f69\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jws8x" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.858032 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/e2a24c21-ab84-4989-a154-b8c9118c31bf-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-lrj5w\" (UID: \"e2a24c21-ab84-4989-a154-b8c9118c31bf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lrj5w" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.858312 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-2cxm6"] Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.858437 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1401406-9964-4c43-8192-efd2f32732e5-trusted-ca-bundle\") pod \"apiserver-76f77b778f-t5bxf\" (UID: \"f1401406-9964-4c43-8192-efd2f32732e5\") " pod="openshift-apiserver/apiserver-76f77b778f-t5bxf" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.858470 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skcn8\" (UniqueName: \"kubernetes.io/projected/c9f070bb-41a2-458e-9818-063ffb52008a-kube-api-access-skcn8\") pod \"console-operator-58897d9998-x7t4l\" (UID: \"c9f070bb-41a2-458e-9818-063ffb52008a\") " pod="openshift-console-operator/console-operator-58897d9998-x7t4l" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.858712 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f1401406-9964-4c43-8192-efd2f32732e5-node-pullsecrets\") pod \"apiserver-76f77b778f-t5bxf\" (UID: \"f1401406-9964-4c43-8192-efd2f32732e5\") " pod="openshift-apiserver/apiserver-76f77b778f-t5bxf" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.858757 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9f070bb-41a2-458e-9818-063ffb52008a-config\") pod \"console-operator-58897d9998-x7t4l\" (UID: \"c9f070bb-41a2-458e-9818-063ffb52008a\") " pod="openshift-console-operator/console-operator-58897d9998-x7t4l" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.858795 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f1401406-9964-4c43-8192-efd2f32732e5-etcd-client\") pod \"apiserver-76f77b778f-t5bxf\" (UID: \"f1401406-9964-4c43-8192-efd2f32732e5\") " pod="openshift-apiserver/apiserver-76f77b778f-t5bxf" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.858890 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8ks8\" (UniqueName: \"kubernetes.io/projected/f1401406-9964-4c43-8192-efd2f32732e5-kube-api-access-j8ks8\") pod \"apiserver-76f77b778f-t5bxf\" (UID: \"f1401406-9964-4c43-8192-efd2f32732e5\") " pod="openshift-apiserver/apiserver-76f77b778f-t5bxf" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.858930 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-mq564\" (UID: \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\") " pod="openshift-authentication/oauth-openshift-558db77b4-mq564" Mar 10 00:09:07 crc 
kubenswrapper[4906]: I0310 00:09:07.858986 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85226c05-ab07-4243-89af-c58b7c3d1f43-config\") pod \"controller-manager-879f6c89f-v25gg\" (UID: \"85226c05-ab07-4243-89af-c58b7c3d1f43\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v25gg" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.859017 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1401406-9964-4c43-8192-efd2f32732e5-serving-cert\") pod \"apiserver-76f77b778f-t5bxf\" (UID: \"f1401406-9964-4c43-8192-efd2f32732e5\") " pod="openshift-apiserver/apiserver-76f77b778f-t5bxf" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.859066 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ef57cce-ba04-4605-a245-edfad55f6f69-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-jws8x\" (UID: \"7ef57cce-ba04-4605-a245-edfad55f6f69\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jws8x" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.859103 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7ef57cce-ba04-4605-a245-edfad55f6f69-encryption-config\") pod \"apiserver-7bbb656c7d-jws8x\" (UID: \"7ef57cce-ba04-4605-a245-edfad55f6f69\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jws8x" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.859192 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f1401406-9964-4c43-8192-efd2f32732e5-encryption-config\") pod \"apiserver-76f77b778f-t5bxf\" (UID: 
\"f1401406-9964-4c43-8192-efd2f32732e5\") " pod="openshift-apiserver/apiserver-76f77b778f-t5bxf" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.859219 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/39b95b69-1fe1-4e7c-9499-ca59220ca7eb-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-kzklf\" (UID: \"39b95b69-1fe1-4e7c-9499-ca59220ca7eb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kzklf" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.859516 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-mq564\" (UID: \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\") " pod="openshift-authentication/oauth-openshift-558db77b4-mq564" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.859554 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/85226c05-ab07-4243-89af-c58b7c3d1f43-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-v25gg\" (UID: \"85226c05-ab07-4243-89af-c58b7c3d1f43\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v25gg" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.859579 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ef57cce-ba04-4605-a245-edfad55f6f69-serving-cert\") pod \"apiserver-7bbb656c7d-jws8x\" (UID: \"7ef57cce-ba04-4605-a245-edfad55f6f69\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jws8x" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.859611 4906 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm925\" (UniqueName: \"kubernetes.io/projected/7ef57cce-ba04-4605-a245-edfad55f6f69-kube-api-access-pm925\") pod \"apiserver-7bbb656c7d-jws8x\" (UID: \"7ef57cce-ba04-4605-a245-edfad55f6f69\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jws8x" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.859671 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7ef57cce-ba04-4605-a245-edfad55f6f69-etcd-client\") pod \"apiserver-7bbb656c7d-jws8x\" (UID: \"7ef57cce-ba04-4605-a245-edfad55f6f69\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jws8x" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.859720 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjdx4\" (UniqueName: \"kubernetes.io/projected/4b4509ca-5d20-4f5c-89ea-a910f792ff82-kube-api-access-vjdx4\") pod \"oauth-openshift-558db77b4-mq564\" (UID: \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\") " pod="openshift-authentication/oauth-openshift-558db77b4-mq564" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.859748 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-mq564\" (UID: \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\") " pod="openshift-authentication/oauth-openshift-558db77b4-mq564" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.859799 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b062963a-a86b-43bb-ba19-c49fa8ac7cf1-config\") pod \"openshift-apiserver-operator-796bbdcf4f-b492s\" (UID: 
\"b062963a-a86b-43bb-ba19-c49fa8ac7cf1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b492s" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.859831 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fxxr\" (UniqueName: \"kubernetes.io/projected/e2a24c21-ab84-4989-a154-b8c9118c31bf-kube-api-access-6fxxr\") pod \"machine-api-operator-5694c8668f-lrj5w\" (UID: \"e2a24c21-ab84-4989-a154-b8c9118c31bf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lrj5w" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.859877 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f1401406-9964-4c43-8192-efd2f32732e5-audit-dir\") pod \"apiserver-76f77b778f-t5bxf\" (UID: \"f1401406-9964-4c43-8192-efd2f32732e5\") " pod="openshift-apiserver/apiserver-76f77b778f-t5bxf" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.859903 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7ef57cce-ba04-4605-a245-edfad55f6f69-audit-dir\") pod \"apiserver-7bbb656c7d-jws8x\" (UID: \"7ef57cce-ba04-4605-a245-edfad55f6f69\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jws8x" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.859928 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4b4509ca-5d20-4f5c-89ea-a910f792ff82-audit-dir\") pod \"oauth-openshift-558db77b4-mq564\" (UID: \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\") " pod="openshift-authentication/oauth-openshift-558db77b4-mq564" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.859972 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" 
(UniqueName: \"kubernetes.io/configmap/f1401406-9964-4c43-8192-efd2f32732e5-etcd-serving-ca\") pod \"apiserver-76f77b778f-t5bxf\" (UID: \"f1401406-9964-4c43-8192-efd2f32732e5\") " pod="openshift-apiserver/apiserver-76f77b778f-t5bxf" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.859993 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e2a24c21-ab84-4989-a154-b8c9118c31bf-images\") pod \"machine-api-operator-5694c8668f-lrj5w\" (UID: \"e2a24c21-ab84-4989-a154-b8c9118c31bf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lrj5w" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.860042 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-mq564\" (UID: \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\") " pod="openshift-authentication/oauth-openshift-558db77b4-mq564" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.860070 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-mq564\" (UID: \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\") " pod="openshift-authentication/oauth-openshift-558db77b4-mq564" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.860126 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-mq564\" (UID: \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-mq564" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.860163 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4b4509ca-5d20-4f5c-89ea-a910f792ff82-audit-policies\") pod \"oauth-openshift-558db77b4-mq564\" (UID: \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\") " pod="openshift-authentication/oauth-openshift-558db77b4-mq564" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.860296 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-mq564\" (UID: \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\") " pod="openshift-authentication/oauth-openshift-558db77b4-mq564" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.860324 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85226c05-ab07-4243-89af-c58b7c3d1f43-serving-cert\") pod \"controller-manager-879f6c89f-v25gg\" (UID: \"85226c05-ab07-4243-89af-c58b7c3d1f43\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v25gg" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.860450 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-mq564\" (UID: \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\") " pod="openshift-authentication/oauth-openshift-558db77b4-mq564" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.860482 4906 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c9f070bb-41a2-458e-9818-063ffb52008a-trusted-ca\") pod \"console-operator-58897d9998-x7t4l\" (UID: \"c9f070bb-41a2-458e-9818-063ffb52008a\") " pod="openshift-console-operator/console-operator-58897d9998-x7t4l" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.860541 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-mq564\" (UID: \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\") " pod="openshift-authentication/oauth-openshift-558db77b4-mq564" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.860621 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1401406-9964-4c43-8192-efd2f32732e5-config\") pod \"apiserver-76f77b778f-t5bxf\" (UID: \"f1401406-9964-4c43-8192-efd2f32732e5\") " pod="openshift-apiserver/apiserver-76f77b778f-t5bxf" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.860679 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-mq564\" (UID: \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\") " pod="openshift-authentication/oauth-openshift-558db77b4-mq564" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.857752 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.858402 4906 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.859580 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.863289 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"serviceca" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.859659 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.860089 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.861196 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.861999 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.865755 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.866625 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.867366 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/85226c05-ab07-4243-89af-c58b7c3d1f43-client-ca\") pod \"controller-manager-879f6c89f-v25gg\" (UID: \"85226c05-ab07-4243-89af-c58b7c3d1f43\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v25gg" 
Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.867454 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2a24c21-ab84-4989-a154-b8c9118c31bf-config\") pod \"machine-api-operator-5694c8668f-lrj5w\" (UID: \"e2a24c21-ab84-4989-a154-b8c9118c31bf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lrj5w" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.867503 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f1401406-9964-4c43-8192-efd2f32732e5-audit\") pod \"apiserver-76f77b778f-t5bxf\" (UID: \"f1401406-9964-4c43-8192-efd2f32732e5\") " pod="openshift-apiserver/apiserver-76f77b778f-t5bxf" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.867570 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7ef57cce-ba04-4605-a245-edfad55f6f69-audit-policies\") pod \"apiserver-7bbb656c7d-jws8x\" (UID: \"7ef57cce-ba04-4605-a245-edfad55f6f69\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jws8x" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.867603 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqjff\" (UniqueName: \"kubernetes.io/projected/39b95b69-1fe1-4e7c-9499-ca59220ca7eb-kube-api-access-nqjff\") pod \"cluster-samples-operator-665b6dd947-kzklf\" (UID: \"39b95b69-1fe1-4e7c-9499-ca59220ca7eb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kzklf" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.867730 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-mq564\" (UID: \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\") " pod="openshift-authentication/oauth-openshift-558db77b4-mq564" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.867863 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f1401406-9964-4c43-8192-efd2f32732e5-image-import-ca\") pod \"apiserver-76f77b778f-t5bxf\" (UID: \"f1401406-9964-4c43-8192-efd2f32732e5\") " pod="openshift-apiserver/apiserver-76f77b778f-t5bxf" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.867952 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5h2s\" (UniqueName: \"kubernetes.io/projected/85226c05-ab07-4243-89af-c58b7c3d1f43-kube-api-access-f5h2s\") pod \"controller-manager-879f6c89f-v25gg\" (UID: \"85226c05-ab07-4243-89af-c58b7c3d1f43\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v25gg" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.880228 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2cxm6" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.882413 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.885768 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-kpmwl"] Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.886448 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-427rj"] Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.886873 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ldnm9"] Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.887438 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ldnm9" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.887850 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-kpmwl" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.888133 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-427rj" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.889329 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.889893 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.890979 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.891205 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.891405 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.891516 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"pruner-dockercfg-p7bcw" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.891602 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.895579 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gcx4m"] Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.896139 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wcpw2"] Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.896451 4906 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-multus/multus-admission-controller-857f4d67dd-gtbds"] Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.896973 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-gtbds" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.897478 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gcx4m" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.897658 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wcpw2" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.898768 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.899085 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.899298 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.900789 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xd62p"] Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.900839 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.901141 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.901450 4906 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console"/"console-serving-cert" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.902001 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vw9rg"] Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.902203 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xd62p" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.903483 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nkxsv"] Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.903569 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vw9rg" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.904817 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.907285 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-kfw9l"] Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.906717 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.907340 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nkxsv" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.908079 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kfw9l" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.908453 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-krjqb"] Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.909575 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-krjqb" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.909970 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b492s"] Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.911372 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-qgdqx"] Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.912275 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qgdqx" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.916932 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.919078 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rlnd7"] Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.919694 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rlnd7" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.926035 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qgvms"] Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.927628 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-t5bxf"] Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.927835 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qgvms" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.928702 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-mnn4p"] Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.930344 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551680-7lx2x"] Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.930914 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551680-7lx2x" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.931109 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mnn4p" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.934197 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-fvmnw"] Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.935070 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-fvmnw" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.958203 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.960540 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kzklf"] Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.960703 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-xzvmk"] Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.961849 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551688-fkkqj"] Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.968903 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-zkqrr"] Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.969865 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551688-fkkqj" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.969995 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-jws8x"] Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.988446 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2279b4a5-d4f2-455a-87e2-0774d6d57c40-config\") pod \"authentication-operator-69f744f599-ltqv2\" (UID: \"2279b4a5-d4f2-455a-87e2-0774d6d57c40\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ltqv2" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.988513 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1401406-9964-4c43-8192-efd2f32732e5-config\") pod \"apiserver-76f77b778f-t5bxf\" (UID: \"f1401406-9964-4c43-8192-efd2f32732e5\") " pod="openshift-apiserver/apiserver-76f77b778f-t5bxf" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.988552 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-mq564\" (UID: \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\") " pod="openshift-authentication/oauth-openshift-558db77b4-mq564" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.988578 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/85226c05-ab07-4243-89af-c58b7c3d1f43-client-ca\") pod \"controller-manager-879f6c89f-v25gg\" (UID: \"85226c05-ab07-4243-89af-c58b7c3d1f43\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v25gg" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.988599 4906 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2a24c21-ab84-4989-a154-b8c9118c31bf-config\") pod \"machine-api-operator-5694c8668f-lrj5w\" (UID: \"e2a24c21-ab84-4989-a154-b8c9118c31bf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lrj5w" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.988620 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de2ea5bf-12b8-4ab0-a073-53df41c9646a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-427rj\" (UID: \"de2ea5bf-12b8-4ab0-a073-53df41c9646a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-427rj" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.988657 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f1401406-9964-4c43-8192-efd2f32732e5-audit\") pod \"apiserver-76f77b778f-t5bxf\" (UID: \"f1401406-9964-4c43-8192-efd2f32732e5\") " pod="openshift-apiserver/apiserver-76f77b778f-t5bxf" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.988682 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/66a5f13c-92bf-4e0b-ac5e-f14de7c6f72a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-xd62p\" (UID: \"66a5f13c-92bf-4e0b-ac5e-f14de7c6f72a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xd62p" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.988708 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebf8d4f5-6995-45dd-be49-491ced904443-config\") pod 
\"route-controller-manager-6576b87f9c-87hqt\" (UID: \"ebf8d4f5-6995-45dd-be49-491ced904443\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-87hqt" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.988730 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7ef57cce-ba04-4605-a245-edfad55f6f69-audit-policies\") pod \"apiserver-7bbb656c7d-jws8x\" (UID: \"7ef57cce-ba04-4605-a245-edfad55f6f69\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jws8x" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.988754 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqjff\" (UniqueName: \"kubernetes.io/projected/39b95b69-1fe1-4e7c-9499-ca59220ca7eb-kube-api-access-nqjff\") pod \"cluster-samples-operator-665b6dd947-kzklf\" (UID: \"39b95b69-1fe1-4e7c-9499-ca59220ca7eb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kzklf" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.988774 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d91d282e-a4f3-4bc9-9623-4640774f641a-serviceca\") pod \"image-pruner-29551680-hclh4\" (UID: \"d91d282e-a4f3-4bc9-9623-4640774f641a\") " pod="openshift-image-registry/image-pruner-29551680-hclh4" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.988796 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6eeaccaa-9989-4a5c-91ed-67dfb7319e14-config\") pod \"machine-approver-56656f9798-sjjvc\" (UID: \"6eeaccaa-9989-4a5c-91ed-67dfb7319e14\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sjjvc" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.988818 4906 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kxtw\" (UniqueName: \"kubernetes.io/projected/cff04257-cb6e-44cf-93de-f4e2cdf6698d-kube-api-access-2kxtw\") pod \"dns-operator-744455d44c-qnppg\" (UID: \"cff04257-cb6e-44cf-93de-f4e2cdf6698d\") " pod="openshift-dns-operator/dns-operator-744455d44c-qnppg" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.988839 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/39ebc592-086a-43e6-87d6-c2d67607d511-service-ca\") pod \"console-f9d7485db-q9zx6\" (UID: \"39ebc592-086a-43e6-87d6-c2d67607d511\") " pod="openshift-console/console-f9d7485db-q9zx6" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.988867 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-mq564\" (UID: \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\") " pod="openshift-authentication/oauth-openshift-558db77b4-mq564" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.988904 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sz5k6\" (UniqueName: \"kubernetes.io/projected/9e40ce76-0ae5-42cc-9670-a40bb4b4e2e4-kube-api-access-sz5k6\") pod \"etcd-operator-b45778765-qfc95\" (UID: \"9e40ce76-0ae5-42cc-9670-a40bb4b4e2e4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qfc95" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.988937 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de2ea5bf-12b8-4ab0-a073-53df41c9646a-config\") pod \"kube-controller-manager-operator-78b949d7b-427rj\" (UID: \"de2ea5bf-12b8-4ab0-a073-53df41c9646a\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-427rj" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.988960 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5vhz\" (UniqueName: \"kubernetes.io/projected/6bbf8e83-d583-49de-a12c-6f0a3953dc67-kube-api-access-q5vhz\") pod \"router-default-5444994796-kpmwl\" (UID: \"6bbf8e83-d583-49de-a12c-6f0a3953dc67\") " pod="openshift-ingress/router-default-5444994796-kpmwl" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.988983 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39ebc592-086a-43e6-87d6-c2d67607d511-trusted-ca-bundle\") pod \"console-f9d7485db-q9zx6\" (UID: \"39ebc592-086a-43e6-87d6-c2d67607d511\") " pod="openshift-console/console-f9d7485db-q9zx6" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.989004 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f9945a5-74c6-482d-b4c4-1097e3efebe0-serving-cert\") pod \"openshift-config-operator-7777fb866f-vt2mt\" (UID: \"7f9945a5-74c6-482d-b4c4-1097e3efebe0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vt2mt" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.989038 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dc8ebb5d-19af-4da1-8767-4b901d1fef8a-bound-sa-token\") pod \"ingress-operator-5b745b69d9-2cxm6\" (UID: \"dc8ebb5d-19af-4da1-8767-4b901d1fef8a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2cxm6" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.989059 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-4b8bt\" (UniqueName: \"kubernetes.io/projected/376340e6-76cd-476b-b86b-350cb9edfecf-kube-api-access-4b8bt\") pod \"openshift-controller-manager-operator-756b6f6bc6-zxgxz\" (UID: \"376340e6-76cd-476b-b86b-350cb9edfecf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zxgxz" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.989113 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f1401406-9964-4c43-8192-efd2f32732e5-image-import-ca\") pod \"apiserver-76f77b778f-t5bxf\" (UID: \"f1401406-9964-4c43-8192-efd2f32732e5\") " pod="openshift-apiserver/apiserver-76f77b778f-t5bxf" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.989135 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5h2s\" (UniqueName: \"kubernetes.io/projected/85226c05-ab07-4243-89af-c58b7c3d1f43-kube-api-access-f5h2s\") pod \"controller-manager-879f6c89f-v25gg\" (UID: \"85226c05-ab07-4243-89af-c58b7c3d1f43\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v25gg" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.989155 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/9e40ce76-0ae5-42cc-9670-a40bb4b4e2e4-etcd-service-ca\") pod \"etcd-operator-b45778765-qfc95\" (UID: \"9e40ce76-0ae5-42cc-9670-a40bb4b4e2e4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qfc95" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.989176 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cff04257-cb6e-44cf-93de-f4e2cdf6698d-metrics-tls\") pod \"dns-operator-744455d44c-qnppg\" (UID: \"cff04257-cb6e-44cf-93de-f4e2cdf6698d\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-qnppg" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.989204 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b062963a-a86b-43bb-ba19-c49fa8ac7cf1-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-b492s\" (UID: \"b062963a-a86b-43bb-ba19-c49fa8ac7cf1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b492s" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.989224 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9f070bb-41a2-458e-9818-063ffb52008a-serving-cert\") pod \"console-operator-58897d9998-x7t4l\" (UID: \"c9f070bb-41a2-458e-9818-063ffb52008a\") " pod="openshift-console-operator/console-operator-58897d9998-x7t4l" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.989250 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwhqv\" (UniqueName: \"kubernetes.io/projected/b062963a-a86b-43bb-ba19-c49fa8ac7cf1-kube-api-access-dwhqv\") pod \"openshift-apiserver-operator-796bbdcf4f-b492s\" (UID: \"b062963a-a86b-43bb-ba19-c49fa8ac7cf1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b492s" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.989272 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e40ce76-0ae5-42cc-9670-a40bb4b4e2e4-config\") pod \"etcd-operator-b45778765-qfc95\" (UID: \"9e40ce76-0ae5-42cc-9670-a40bb4b4e2e4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qfc95" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.989292 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/e2a24c21-ab84-4989-a154-b8c9118c31bf-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-lrj5w\" (UID: \"e2a24c21-ab84-4989-a154-b8c9118c31bf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lrj5w" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.989314 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1401406-9964-4c43-8192-efd2f32732e5-trusted-ca-bundle\") pod \"apiserver-76f77b778f-t5bxf\" (UID: \"f1401406-9964-4c43-8192-efd2f32732e5\") " pod="openshift-apiserver/apiserver-76f77b778f-t5bxf" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.989338 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7ef57cce-ba04-4605-a245-edfad55f6f69-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-jws8x\" (UID: \"7ef57cce-ba04-4605-a245-edfad55f6f69\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jws8x" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.989367 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/7f9945a5-74c6-482d-b4c4-1097e3efebe0-available-featuregates\") pod \"openshift-config-operator-7777fb866f-vt2mt\" (UID: \"7f9945a5-74c6-482d-b4c4-1097e3efebe0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vt2mt" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.989388 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebf8d4f5-6995-45dd-be49-491ced904443-serving-cert\") pod \"route-controller-manager-6576b87f9c-87hqt\" (UID: \"ebf8d4f5-6995-45dd-be49-491ced904443\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-87hqt" Mar 10 00:09:07 crc 
kubenswrapper[4906]: I0310 00:09:07.989412 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6bbf8e83-d583-49de-a12c-6f0a3953dc67-stats-auth\") pod \"router-default-5444994796-kpmwl\" (UID: \"6bbf8e83-d583-49de-a12c-6f0a3953dc67\") " pod="openshift-ingress/router-default-5444994796-kpmwl" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.989435 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skcn8\" (UniqueName: \"kubernetes.io/projected/c9f070bb-41a2-458e-9818-063ffb52008a-kube-api-access-skcn8\") pod \"console-operator-58897d9998-x7t4l\" (UID: \"c9f070bb-41a2-458e-9818-063ffb52008a\") " pod="openshift-console-operator/console-operator-58897d9998-x7t4l" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.989455 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f1401406-9964-4c43-8192-efd2f32732e5-node-pullsecrets\") pod \"apiserver-76f77b778f-t5bxf\" (UID: \"f1401406-9964-4c43-8192-efd2f32732e5\") " pod="openshift-apiserver/apiserver-76f77b778f-t5bxf" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.989509 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/376340e6-76cd-476b-b86b-350cb9edfecf-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-zxgxz\" (UID: \"376340e6-76cd-476b-b86b-350cb9edfecf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zxgxz" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.989541 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9f070bb-41a2-458e-9818-063ffb52008a-config\") pod \"console-operator-58897d9998-x7t4l\" (UID: 
\"c9f070bb-41a2-458e-9818-063ffb52008a\") " pod="openshift-console-operator/console-operator-58897d9998-x7t4l" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.989562 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/ad986187-a8c3-4e1d-8bff-3f385af901e4-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-ldnm9\" (UID: \"ad986187-a8c3-4e1d-8bff-3f385af901e4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ldnm9" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.989584 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f1401406-9964-4c43-8192-efd2f32732e5-etcd-client\") pod \"apiserver-76f77b778f-t5bxf\" (UID: \"f1401406-9964-4c43-8192-efd2f32732e5\") " pod="openshift-apiserver/apiserver-76f77b778f-t5bxf" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.989603 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8ks8\" (UniqueName: \"kubernetes.io/projected/f1401406-9964-4c43-8192-efd2f32732e5-kube-api-access-j8ks8\") pod \"apiserver-76f77b778f-t5bxf\" (UID: \"f1401406-9964-4c43-8192-efd2f32732e5\") " pod="openshift-apiserver/apiserver-76f77b778f-t5bxf" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.989625 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-mq564\" (UID: \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\") " pod="openshift-authentication/oauth-openshift-558db77b4-mq564" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.989665 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-n2tz6\" (UniqueName: \"kubernetes.io/projected/ad986187-a8c3-4e1d-8bff-3f385af901e4-kube-api-access-n2tz6\") pod \"package-server-manager-789f6589d5-ldnm9\" (UID: \"ad986187-a8c3-4e1d-8bff-3f385af901e4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ldnm9" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.989686 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1401406-9964-4c43-8192-efd2f32732e5-serving-cert\") pod \"apiserver-76f77b778f-t5bxf\" (UID: \"f1401406-9964-4c43-8192-efd2f32732e5\") " pod="openshift-apiserver/apiserver-76f77b778f-t5bxf" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.989714 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ef57cce-ba04-4605-a245-edfad55f6f69-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-jws8x\" (UID: \"7ef57cce-ba04-4605-a245-edfad55f6f69\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jws8x" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.989734 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7ef57cce-ba04-4605-a245-edfad55f6f69-encryption-config\") pod \"apiserver-7bbb656c7d-jws8x\" (UID: \"7ef57cce-ba04-4605-a245-edfad55f6f69\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jws8x" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.989757 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85226c05-ab07-4243-89af-c58b7c3d1f43-config\") pod \"controller-manager-879f6c89f-v25gg\" (UID: \"85226c05-ab07-4243-89af-c58b7c3d1f43\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v25gg" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.989779 4906 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnkvl\" (UniqueName: \"kubernetes.io/projected/ebf8d4f5-6995-45dd-be49-491ced904443-kube-api-access-jnkvl\") pod \"route-controller-manager-6576b87f9c-87hqt\" (UID: \"ebf8d4f5-6995-45dd-be49-491ced904443\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-87hqt" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.989805 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bbf8e83-d583-49de-a12c-6f0a3953dc67-service-ca-bundle\") pod \"router-default-5444994796-kpmwl\" (UID: \"6bbf8e83-d583-49de-a12c-6f0a3953dc67\") " pod="openshift-ingress/router-default-5444994796-kpmwl" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.989832 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9e40ce76-0ae5-42cc-9670-a40bb4b4e2e4-etcd-client\") pod \"etcd-operator-b45778765-qfc95\" (UID: \"9e40ce76-0ae5-42cc-9670-a40bb4b4e2e4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qfc95" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.989850 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/39ebc592-086a-43e6-87d6-c2d67607d511-console-config\") pod \"console-f9d7485db-q9zx6\" (UID: \"39ebc592-086a-43e6-87d6-c2d67607d511\") " pod="openshift-console/console-f9d7485db-q9zx6" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.989876 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f1401406-9964-4c43-8192-efd2f32732e5-encryption-config\") pod \"apiserver-76f77b778f-t5bxf\" (UID: \"f1401406-9964-4c43-8192-efd2f32732e5\") " 
pod="openshift-apiserver/apiserver-76f77b778f-t5bxf" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.989929 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/39b95b69-1fe1-4e7c-9499-ca59220ca7eb-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-kzklf\" (UID: \"39b95b69-1fe1-4e7c-9499-ca59220ca7eb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kzklf" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.989950 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2279b4a5-d4f2-455a-87e2-0774d6d57c40-serving-cert\") pod \"authentication-operator-69f744f599-ltqv2\" (UID: \"2279b4a5-d4f2-455a-87e2-0774d6d57c40\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ltqv2" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.989970 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/9e40ce76-0ae5-42cc-9670-a40bb4b4e2e4-etcd-ca\") pod \"etcd-operator-b45778765-qfc95\" (UID: \"9e40ce76-0ae5-42cc-9670-a40bb4b4e2e4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qfc95" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.990003 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72gzx\" (UniqueName: \"kubernetes.io/projected/7f9945a5-74c6-482d-b4c4-1097e3efebe0-kube-api-access-72gzx\") pod \"openshift-config-operator-7777fb866f-vt2mt\" (UID: \"7f9945a5-74c6-482d-b4c4-1097e3efebe0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vt2mt" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.990029 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-mq564\" (UID: \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\") " pod="openshift-authentication/oauth-openshift-558db77b4-mq564" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.990048 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/376340e6-76cd-476b-b86b-350cb9edfecf-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-zxgxz\" (UID: \"376340e6-76cd-476b-b86b-350cb9edfecf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zxgxz" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.990073 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shz9k\" (UniqueName: \"kubernetes.io/projected/39ebc592-086a-43e6-87d6-c2d67607d511-kube-api-access-shz9k\") pod \"console-f9d7485db-q9zx6\" (UID: \"39ebc592-086a-43e6-87d6-c2d67607d511\") " pod="openshift-console/console-f9d7485db-q9zx6" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.990094 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/85226c05-ab07-4243-89af-c58b7c3d1f43-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-v25gg\" (UID: \"85226c05-ab07-4243-89af-c58b7c3d1f43\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v25gg" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.990115 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7gvw\" (UniqueName: \"kubernetes.io/projected/66a5f13c-92bf-4e0b-ac5e-f14de7c6f72a-kube-api-access-t7gvw\") pod \"control-plane-machine-set-operator-78cbb6b69f-xd62p\" (UID: 
\"66a5f13c-92bf-4e0b-ac5e-f14de7c6f72a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xd62p" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.990134 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kc2k\" (UniqueName: \"kubernetes.io/projected/dc8ebb5d-19af-4da1-8767-4b901d1fef8a-kube-api-access-6kc2k\") pod \"ingress-operator-5b745b69d9-2cxm6\" (UID: \"dc8ebb5d-19af-4da1-8767-4b901d1fef8a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2cxm6" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.990159 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ef57cce-ba04-4605-a245-edfad55f6f69-serving-cert\") pod \"apiserver-7bbb656c7d-jws8x\" (UID: \"7ef57cce-ba04-4605-a245-edfad55f6f69\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jws8x" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.990184 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pm925\" (UniqueName: \"kubernetes.io/projected/7ef57cce-ba04-4605-a245-edfad55f6f69-kube-api-access-pm925\") pod \"apiserver-7bbb656c7d-jws8x\" (UID: \"7ef57cce-ba04-4605-a245-edfad55f6f69\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jws8x" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.990208 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ebf8d4f5-6995-45dd-be49-491ced904443-client-ca\") pod \"route-controller-manager-6576b87f9c-87hqt\" (UID: \"ebf8d4f5-6995-45dd-be49-491ced904443\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-87hqt" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.990228 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2279b4a5-d4f2-455a-87e2-0774d6d57c40-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-ltqv2\" (UID: \"2279b4a5-d4f2-455a-87e2-0774d6d57c40\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ltqv2" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.990250 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dc8ebb5d-19af-4da1-8767-4b901d1fef8a-trusted-ca\") pod \"ingress-operator-5b745b69d9-2cxm6\" (UID: \"dc8ebb5d-19af-4da1-8767-4b901d1fef8a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2cxm6" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.990301 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6bbf8e83-d583-49de-a12c-6f0a3953dc67-default-certificate\") pod \"router-default-5444994796-kpmwl\" (UID: \"6bbf8e83-d583-49de-a12c-6f0a3953dc67\") " pod="openshift-ingress/router-default-5444994796-kpmwl" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.993394 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f1401406-9964-4c43-8192-efd2f32732e5-audit\") pod \"apiserver-76f77b778f-t5bxf\" (UID: \"f1401406-9964-4c43-8192-efd2f32732e5\") " pod="openshift-apiserver/apiserver-76f77b778f-t5bxf" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.994043 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1401406-9964-4c43-8192-efd2f32732e5-config\") pod \"apiserver-76f77b778f-t5bxf\" (UID: \"f1401406-9964-4c43-8192-efd2f32732e5\") " pod="openshift-apiserver/apiserver-76f77b778f-t5bxf" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.998788 4906 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-console-operator/console-operator-58897d9998-x7t4l"] Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.999148 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29551680-hclh4"] Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.999235 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e40ce76-0ae5-42cc-9670-a40bb4b4e2e4-serving-cert\") pod \"etcd-operator-b45778765-qfc95\" (UID: \"9e40ce76-0ae5-42cc-9670-a40bb4b4e2e4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qfc95" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.999293 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/39ebc592-086a-43e6-87d6-c2d67607d511-oauth-serving-cert\") pod \"console-f9d7485db-q9zx6\" (UID: \"39ebc592-086a-43e6-87d6-c2d67607d511\") " pod="openshift-console/console-f9d7485db-q9zx6" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.999360 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7ef57cce-ba04-4605-a245-edfad55f6f69-etcd-client\") pod \"apiserver-7bbb656c7d-jws8x\" (UID: \"7ef57cce-ba04-4605-a245-edfad55f6f69\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jws8x" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.999391 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjdx4\" (UniqueName: \"kubernetes.io/projected/4b4509ca-5d20-4f5c-89ea-a910f792ff82-kube-api-access-vjdx4\") pod \"oauth-openshift-558db77b4-mq564\" (UID: \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\") " pod="openshift-authentication/oauth-openshift-558db77b4-mq564" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.999418 4906 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsbqs\" (UniqueName: \"kubernetes.io/projected/d91d282e-a4f3-4bc9-9623-4640774f641a-kube-api-access-gsbqs\") pod \"image-pruner-29551680-hclh4\" (UID: \"d91d282e-a4f3-4bc9-9623-4640774f641a\") " pod="openshift-image-registry/image-pruner-29551680-hclh4" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.999464 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-mq564\" (UID: \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\") " pod="openshift-authentication/oauth-openshift-558db77b4-mq564" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.999497 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e5131de2-27e3-4ec2-8a59-7cc11f6865be-srv-cert\") pod \"catalog-operator-68c6474976-gcx4m\" (UID: \"e5131de2-27e3-4ec2-8a59-7cc11f6865be\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gcx4m" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.999525 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7r28\" (UniqueName: \"kubernetes.io/projected/2279b4a5-d4f2-455a-87e2-0774d6d57c40-kube-api-access-l7r28\") pod \"authentication-operator-69f744f599-ltqv2\" (UID: \"2279b4a5-d4f2-455a-87e2-0774d6d57c40\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ltqv2" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.999550 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6eeaccaa-9989-4a5c-91ed-67dfb7319e14-auth-proxy-config\") pod 
\"machine-approver-56656f9798-sjjvc\" (UID: \"6eeaccaa-9989-4a5c-91ed-67dfb7319e14\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sjjvc" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.999576 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/6eeaccaa-9989-4a5c-91ed-67dfb7319e14-machine-approver-tls\") pod \"machine-approver-56656f9798-sjjvc\" (UID: \"6eeaccaa-9989-4a5c-91ed-67dfb7319e14\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sjjvc" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.999615 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b062963a-a86b-43bb-ba19-c49fa8ac7cf1-config\") pod \"openshift-apiserver-operator-796bbdcf4f-b492s\" (UID: \"b062963a-a86b-43bb-ba19-c49fa8ac7cf1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b492s" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.999664 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fxxr\" (UniqueName: \"kubernetes.io/projected/e2a24c21-ab84-4989-a154-b8c9118c31bf-kube-api-access-6fxxr\") pod \"machine-api-operator-5694c8668f-lrj5w\" (UID: \"e2a24c21-ab84-4989-a154-b8c9118c31bf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lrj5w" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.999705 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f1401406-9964-4c43-8192-efd2f32732e5-audit-dir\") pod \"apiserver-76f77b778f-t5bxf\" (UID: \"f1401406-9964-4c43-8192-efd2f32732e5\") " pod="openshift-apiserver/apiserver-76f77b778f-t5bxf" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.999732 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7ef57cce-ba04-4605-a245-edfad55f6f69-audit-dir\") pod \"apiserver-7bbb656c7d-jws8x\" (UID: \"7ef57cce-ba04-4605-a245-edfad55f6f69\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jws8x" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.999756 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4b4509ca-5d20-4f5c-89ea-a910f792ff82-audit-dir\") pod \"oauth-openshift-558db77b4-mq564\" (UID: \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\") " pod="openshift-authentication/oauth-openshift-558db77b4-mq564" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.999781 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2279b4a5-d4f2-455a-87e2-0774d6d57c40-service-ca-bundle\") pod \"authentication-operator-69f744f599-ltqv2\" (UID: \"2279b4a5-d4f2-455a-87e2-0774d6d57c40\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ltqv2" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.999806 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dc8ebb5d-19af-4da1-8767-4b901d1fef8a-metrics-tls\") pod \"ingress-operator-5b745b69d9-2cxm6\" (UID: \"dc8ebb5d-19af-4da1-8767-4b901d1fef8a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2cxm6" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.999827 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfmln\" (UniqueName: \"kubernetes.io/projected/6eeaccaa-9989-4a5c-91ed-67dfb7319e14-kube-api-access-vfmln\") pod \"machine-approver-56656f9798-sjjvc\" (UID: \"6eeaccaa-9989-4a5c-91ed-67dfb7319e14\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sjjvc" 
Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.999856 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f1401406-9964-4c43-8192-efd2f32732e5-etcd-serving-ca\") pod \"apiserver-76f77b778f-t5bxf\" (UID: \"f1401406-9964-4c43-8192-efd2f32732e5\") " pod="openshift-apiserver/apiserver-76f77b778f-t5bxf" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.999881 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e2a24c21-ab84-4989-a154-b8c9118c31bf-images\") pod \"machine-api-operator-5694c8668f-lrj5w\" (UID: \"e2a24c21-ab84-4989-a154-b8c9118c31bf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lrj5w" Mar 10 00:09:07 crc kubenswrapper[4906]: I0310 00:09:07.999907 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/de2ea5bf-12b8-4ab0-a073-53df41c9646a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-427rj\" (UID: \"de2ea5bf-12b8-4ab0-a073-53df41c9646a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-427rj" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:07.999926 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ff15a729-bbdf-4422-8db2-0e429e76ee25-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-2g9zb\" (UID: \"ff15a729-bbdf-4422-8db2-0e429e76ee25\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2g9zb" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:07.999961 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-mq564\" (UID: \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\") " pod="openshift-authentication/oauth-openshift-558db77b4-mq564" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:07.999985 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-mq564\" (UID: \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\") " pod="openshift-authentication/oauth-openshift-558db77b4-mq564" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.000007 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-mq564\" (UID: \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\") " pod="openshift-authentication/oauth-openshift-558db77b4-mq564" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.000032 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4b4509ca-5d20-4f5c-89ea-a910f792ff82-audit-policies\") pod \"oauth-openshift-558db77b4-mq564\" (UID: \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\") " pod="openshift-authentication/oauth-openshift-558db77b4-mq564" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.000054 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-mq564\" (UID: \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\") " pod="openshift-authentication/oauth-openshift-558db77b4-mq564" 
Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.000079 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85226c05-ab07-4243-89af-c58b7c3d1f43-serving-cert\") pod \"controller-manager-879f6c89f-v25gg\" (UID: \"85226c05-ab07-4243-89af-c58b7c3d1f43\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v25gg" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.000106 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6f87\" (UniqueName: \"kubernetes.io/projected/e5131de2-27e3-4ec2-8a59-7cc11f6865be-kube-api-access-x6f87\") pod \"catalog-operator-68c6474976-gcx4m\" (UID: \"e5131de2-27e3-4ec2-8a59-7cc11f6865be\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gcx4m" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.000129 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/68de0039-27ee-4f02-987f-9105c69c355f-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-gtbds\" (UID: \"68de0039-27ee-4f02-987f-9105c69c355f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gtbds" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.000203 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-mq564\" (UID: \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\") " pod="openshift-authentication/oauth-openshift-558db77b4-mq564" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.000222 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/e5131de2-27e3-4ec2-8a59-7cc11f6865be-profile-collector-cert\") pod \"catalog-operator-68c6474976-gcx4m\" (UID: \"e5131de2-27e3-4ec2-8a59-7cc11f6865be\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gcx4m" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.000247 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bbf8e83-d583-49de-a12c-6f0a3953dc67-metrics-certs\") pod \"router-default-5444994796-kpmwl\" (UID: \"6bbf8e83-d583-49de-a12c-6f0a3953dc67\") " pod="openshift-ingress/router-default-5444994796-kpmwl" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.000270 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ff15a729-bbdf-4422-8db2-0e429e76ee25-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-2g9zb\" (UID: \"ff15a729-bbdf-4422-8db2-0e429e76ee25\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2g9zb" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.000292 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/ff15a729-bbdf-4422-8db2-0e429e76ee25-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-2g9zb\" (UID: \"ff15a729-bbdf-4422-8db2-0e429e76ee25\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2g9zb" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.000324 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c9f070bb-41a2-458e-9818-063ffb52008a-trusted-ca\") pod \"console-operator-58897d9998-x7t4l\" (UID: \"c9f070bb-41a2-458e-9818-063ffb52008a\") " 
pod="openshift-console-operator/console-operator-58897d9998-x7t4l" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.000349 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hs46\" (UniqueName: \"kubernetes.io/projected/1c418134-7dc2-42f5-b1ce-c22a2b77a4b9-kube-api-access-6hs46\") pod \"downloads-7954f5f757-zkqrr\" (UID: \"1c418134-7dc2-42f5-b1ce-c22a2b77a4b9\") " pod="openshift-console/downloads-7954f5f757-zkqrr" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.000372 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcfd4\" (UniqueName: \"kubernetes.io/projected/ff15a729-bbdf-4422-8db2-0e429e76ee25-kube-api-access-fcfd4\") pod \"cluster-image-registry-operator-dc59b4c8b-2g9zb\" (UID: \"ff15a729-bbdf-4422-8db2-0e429e76ee25\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2g9zb" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.000395 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/39ebc592-086a-43e6-87d6-c2d67607d511-console-serving-cert\") pod \"console-f9d7485db-q9zx6\" (UID: \"39ebc592-086a-43e6-87d6-c2d67607d511\") " pod="openshift-console/console-f9d7485db-q9zx6" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.000429 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-mq564\" (UID: \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\") " pod="openshift-authentication/oauth-openshift-558db77b4-mq564" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.000451 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-6dfsl\" (UniqueName: \"kubernetes.io/projected/68de0039-27ee-4f02-987f-9105c69c355f-kube-api-access-6dfsl\") pod \"multus-admission-controller-857f4d67dd-gtbds\" (UID: \"68de0039-27ee-4f02-987f-9105c69c355f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gtbds" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.000477 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/39ebc592-086a-43e6-87d6-c2d67607d511-console-oauth-config\") pod \"console-f9d7485db-q9zx6\" (UID: \"39ebc592-086a-43e6-87d6-c2d67607d511\") " pod="openshift-console/console-f9d7485db-q9zx6" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.000761 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-xzvmk" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.001301 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.005481 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/85226c05-ab07-4243-89af-c58b7c3d1f43-client-ca\") pod \"controller-manager-879f6c89f-v25gg\" (UID: \"85226c05-ab07-4243-89af-c58b7c3d1f43\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v25gg" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.009695 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-9rn4f"] Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.010092 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2a24c21-ab84-4989-a154-b8c9118c31bf-config\") pod \"machine-api-operator-5694c8668f-lrj5w\" (UID: \"e2a24c21-ab84-4989-a154-b8c9118c31bf\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-lrj5w" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.011688 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b062963a-a86b-43bb-ba19-c49fa8ac7cf1-config\") pod \"openshift-apiserver-operator-796bbdcf4f-b492s\" (UID: \"b062963a-a86b-43bb-ba19-c49fa8ac7cf1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b492s" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.012255 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.012559 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f1401406-9964-4c43-8192-efd2f32732e5-image-import-ca\") pod \"apiserver-76f77b778f-t5bxf\" (UID: \"f1401406-9964-4c43-8192-efd2f32732e5\") " pod="openshift-apiserver/apiserver-76f77b778f-t5bxf" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.012621 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ef57cce-ba04-4605-a245-edfad55f6f69-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-jws8x\" (UID: \"7ef57cce-ba04-4605-a245-edfad55f6f69\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jws8x" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.014106 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7ef57cce-ba04-4605-a245-edfad55f6f69-audit-policies\") pod \"apiserver-7bbb656c7d-jws8x\" (UID: \"7ef57cce-ba04-4605-a245-edfad55f6f69\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jws8x" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.014257 4906 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-mq564\" (UID: \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\") " pod="openshift-authentication/oauth-openshift-558db77b4-mq564" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.014856 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7ef57cce-ba04-4605-a245-edfad55f6f69-etcd-client\") pod \"apiserver-7bbb656c7d-jws8x\" (UID: \"7ef57cce-ba04-4605-a245-edfad55f6f69\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jws8x" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.015178 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1401406-9964-4c43-8192-efd2f32732e5-serving-cert\") pod \"apiserver-76f77b778f-t5bxf\" (UID: \"f1401406-9964-4c43-8192-efd2f32732e5\") " pod="openshift-apiserver/apiserver-76f77b778f-t5bxf" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.016709 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9f070bb-41a2-458e-9818-063ffb52008a-serving-cert\") pod \"console-operator-58897d9998-x7t4l\" (UID: \"c9f070bb-41a2-458e-9818-063ffb52008a\") " pod="openshift-console-operator/console-operator-58897d9998-x7t4l" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.017127 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f1401406-9964-4c43-8192-efd2f32732e5-etcd-client\") pod \"apiserver-76f77b778f-t5bxf\" (UID: \"f1401406-9964-4c43-8192-efd2f32732e5\") " pod="openshift-apiserver/apiserver-76f77b778f-t5bxf" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.018799 4906 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"image-registry-operator-tls" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.025773 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-mq564\" (UID: \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\") " pod="openshift-authentication/oauth-openshift-558db77b4-mq564" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.026024 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-mq564\" (UID: \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\") " pod="openshift-authentication/oauth-openshift-558db77b4-mq564" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.027038 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mq564"] Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.027074 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-vt2mt"] Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.027107 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-lrj5w"] Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.027125 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ldnm9"] Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.027139 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-v25gg"] Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.027248 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-9rn4f" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.027476 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-mq564\" (UID: \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\") " pod="openshift-authentication/oauth-openshift-558db77b4-mq564" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.029352 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.030591 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/85226c05-ab07-4243-89af-c58b7c3d1f43-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-v25gg\" (UID: \"85226c05-ab07-4243-89af-c58b7c3d1f43\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v25gg" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.030919 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2g9zb"] Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.031200 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f1401406-9964-4c43-8192-efd2f32732e5-node-pullsecrets\") pod \"apiserver-76f77b778f-t5bxf\" (UID: \"f1401406-9964-4c43-8192-efd2f32732e5\") " pod="openshift-apiserver/apiserver-76f77b778f-t5bxf" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.031843 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f1401406-9964-4c43-8192-efd2f32732e5-etcd-serving-ca\") pod \"apiserver-76f77b778f-t5bxf\" (UID: 
\"f1401406-9964-4c43-8192-efd2f32732e5\") " pod="openshift-apiserver/apiserver-76f77b778f-t5bxf" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.031896 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7ef57cce-ba04-4605-a245-edfad55f6f69-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-jws8x\" (UID: \"7ef57cce-ba04-4605-a245-edfad55f6f69\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jws8x" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.032059 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85226c05-ab07-4243-89af-c58b7c3d1f43-config\") pod \"controller-manager-879f6c89f-v25gg\" (UID: \"85226c05-ab07-4243-89af-c58b7c3d1f43\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v25gg" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.032568 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e2a24c21-ab84-4989-a154-b8c9118c31bf-images\") pod \"machine-api-operator-5694c8668f-lrj5w\" (UID: \"e2a24c21-ab84-4989-a154-b8c9118c31bf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lrj5w" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.032944 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1401406-9964-4c43-8192-efd2f32732e5-trusted-ca-bundle\") pod \"apiserver-76f77b778f-t5bxf\" (UID: \"f1401406-9964-4c43-8192-efd2f32732e5\") " pod="openshift-apiserver/apiserver-76f77b778f-t5bxf" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.033041 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f1401406-9964-4c43-8192-efd2f32732e5-audit-dir\") pod \"apiserver-76f77b778f-t5bxf\" (UID: \"f1401406-9964-4c43-8192-efd2f32732e5\") " 
pod="openshift-apiserver/apiserver-76f77b778f-t5bxf" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.033102 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9f070bb-41a2-458e-9818-063ffb52008a-config\") pod \"console-operator-58897d9998-x7t4l\" (UID: \"c9f070bb-41a2-458e-9818-063ffb52008a\") " pod="openshift-console-operator/console-operator-58897d9998-x7t4l" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.033230 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7ef57cce-ba04-4605-a245-edfad55f6f69-audit-dir\") pod \"apiserver-7bbb656c7d-jws8x\" (UID: \"7ef57cce-ba04-4605-a245-edfad55f6f69\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jws8x" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.033458 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4b4509ca-5d20-4f5c-89ea-a910f792ff82-audit-dir\") pod \"oauth-openshift-558db77b4-mq564\" (UID: \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\") " pod="openshift-authentication/oauth-openshift-558db77b4-mq564" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.033506 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7ef57cce-ba04-4605-a245-edfad55f6f69-encryption-config\") pod \"apiserver-7bbb656c7d-jws8x\" (UID: \"7ef57cce-ba04-4605-a245-edfad55f6f69\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jws8x" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.033683 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-mq564\" (UID: \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-mq564" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.033891 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4b4509ca-5d20-4f5c-89ea-a910f792ff82-audit-policies\") pod \"oauth-openshift-558db77b4-mq564\" (UID: \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\") " pod="openshift-authentication/oauth-openshift-558db77b4-mq564" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.034649 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c9f070bb-41a2-458e-9818-063ffb52008a-trusted-ca\") pod \"console-operator-58897d9998-x7t4l\" (UID: \"c9f070bb-41a2-458e-9818-063ffb52008a\") " pod="openshift-console-operator/console-operator-58897d9998-x7t4l" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.035141 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-mq564\" (UID: \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\") " pod="openshift-authentication/oauth-openshift-558db77b4-mq564" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.035418 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-mq564\" (UID: \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\") " pod="openshift-authentication/oauth-openshift-558db77b4-mq564" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.036931 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/39b95b69-1fe1-4e7c-9499-ca59220ca7eb-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-kzklf\" (UID: \"39b95b69-1fe1-4e7c-9499-ca59220ca7eb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kzklf" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.036950 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ef57cce-ba04-4605-a245-edfad55f6f69-serving-cert\") pod \"apiserver-7bbb656c7d-jws8x\" (UID: \"7ef57cce-ba04-4605-a245-edfad55f6f69\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jws8x" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.037324 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-mq564\" (UID: \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\") " pod="openshift-authentication/oauth-openshift-558db77b4-mq564" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.037352 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-mq564\" (UID: \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\") " pod="openshift-authentication/oauth-openshift-558db77b4-mq564" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.037496 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-mq564\" (UID: \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\") " pod="openshift-authentication/oauth-openshift-558db77b4-mq564" Mar 10 00:09:08 crc 
kubenswrapper[4906]: I0310 00:09:08.037548 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-87hqt"] Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.037615 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.037646 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85226c05-ab07-4243-89af-c58b7c3d1f43-serving-cert\") pod \"controller-manager-879f6c89f-v25gg\" (UID: \"85226c05-ab07-4243-89af-c58b7c3d1f43\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v25gg" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.038862 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-mq564\" (UID: \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\") " pod="openshift-authentication/oauth-openshift-558db77b4-mq564" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.039858 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/e2a24c21-ab84-4989-a154-b8c9118c31bf-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-lrj5w\" (UID: \"e2a24c21-ab84-4989-a154-b8c9118c31bf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lrj5w" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.039987 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-gtbds"] Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.041117 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/b062963a-a86b-43bb-ba19-c49fa8ac7cf1-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-b492s\" (UID: \"b062963a-a86b-43bb-ba19-c49fa8ac7cf1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b492s" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.041300 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-427rj"] Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.044089 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-66jxp"] Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.045221 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xd62p"] Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.046448 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-2cxm6"] Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.047409 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-ltqv2"] Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.048468 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wcpw2"] Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.049884 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-q9zx6"] Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.050606 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-x6g9x"] Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.052164 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-b4rmk"] Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 
00:09:08.052180 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f1401406-9964-4c43-8192-efd2f32732e5-encryption-config\") pod \"apiserver-76f77b778f-t5bxf\" (UID: \"f1401406-9964-4c43-8192-efd2f32732e5\") " pod="openshift-apiserver/apiserver-76f77b778f-t5bxf" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.052496 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-x6g9x" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.053095 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-qnppg"] Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.053483 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-b4rmk" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.054265 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nkxsv"] Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.055281 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zxgxz"] Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.058814 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551680-7lx2x"] Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.061441 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gcx4m"] Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.063297 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-qfc95"] Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.067701 4906 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-x6g9x"] Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.070405 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-kfw9l"] Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.071469 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-fvmnw"] Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.072550 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vw9rg"] Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.073746 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-b4rmk"] Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.074991 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-krjqb"] Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.075211 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.076192 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-qgdqx"] Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.077269 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-xzvmk"] Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.078351 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551688-fkkqj"] Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.079582 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qgvms"] Mar 10 00:09:08 
crc kubenswrapper[4906]: I0310 00:09:08.080720 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rlnd7"] Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.081610 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-mnn4p"] Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.082726 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-j4vpr"] Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.084146 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-j4vpr" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.084657 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-j4vpr"] Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.096301 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.104710 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnkvl\" (UniqueName: \"kubernetes.io/projected/ebf8d4f5-6995-45dd-be49-491ced904443-kube-api-access-jnkvl\") pod \"route-controller-manager-6576b87f9c-87hqt\" (UID: \"ebf8d4f5-6995-45dd-be49-491ced904443\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-87hqt" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.104766 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bbf8e83-d583-49de-a12c-6f0a3953dc67-service-ca-bundle\") pod \"router-default-5444994796-kpmwl\" (UID: \"6bbf8e83-d583-49de-a12c-6f0a3953dc67\") " pod="openshift-ingress/router-default-5444994796-kpmwl" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.104791 4906 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9e40ce76-0ae5-42cc-9670-a40bb4b4e2e4-etcd-client\") pod \"etcd-operator-b45778765-qfc95\" (UID: \"9e40ce76-0ae5-42cc-9670-a40bb4b4e2e4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qfc95" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.104811 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2279b4a5-d4f2-455a-87e2-0774d6d57c40-serving-cert\") pod \"authentication-operator-69f744f599-ltqv2\" (UID: \"2279b4a5-d4f2-455a-87e2-0774d6d57c40\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ltqv2" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.104828 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/9e40ce76-0ae5-42cc-9670-a40bb4b4e2e4-etcd-ca\") pod \"etcd-operator-b45778765-qfc95\" (UID: \"9e40ce76-0ae5-42cc-9670-a40bb4b4e2e4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qfc95" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.104848 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/39ebc592-086a-43e6-87d6-c2d67607d511-console-config\") pod \"console-f9d7485db-q9zx6\" (UID: \"39ebc592-086a-43e6-87d6-c2d67607d511\") " pod="openshift-console/console-f9d7485db-q9zx6" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.104869 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72gzx\" (UniqueName: \"kubernetes.io/projected/7f9945a5-74c6-482d-b4c4-1097e3efebe0-kube-api-access-72gzx\") pod \"openshift-config-operator-7777fb866f-vt2mt\" (UID: \"7f9945a5-74c6-482d-b4c4-1097e3efebe0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vt2mt" Mar 10 00:09:08 
crc kubenswrapper[4906]: I0310 00:09:08.104888 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/376340e6-76cd-476b-b86b-350cb9edfecf-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-zxgxz\" (UID: \"376340e6-76cd-476b-b86b-350cb9edfecf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zxgxz" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.104909 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shz9k\" (UniqueName: \"kubernetes.io/projected/39ebc592-086a-43e6-87d6-c2d67607d511-kube-api-access-shz9k\") pod \"console-f9d7485db-q9zx6\" (UID: \"39ebc592-086a-43e6-87d6-c2d67607d511\") " pod="openshift-console/console-f9d7485db-q9zx6" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.104930 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7gvw\" (UniqueName: \"kubernetes.io/projected/66a5f13c-92bf-4e0b-ac5e-f14de7c6f72a-kube-api-access-t7gvw\") pod \"control-plane-machine-set-operator-78cbb6b69f-xd62p\" (UID: \"66a5f13c-92bf-4e0b-ac5e-f14de7c6f72a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xd62p" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.104950 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kc2k\" (UniqueName: \"kubernetes.io/projected/dc8ebb5d-19af-4da1-8767-4b901d1fef8a-kube-api-access-6kc2k\") pod \"ingress-operator-5b745b69d9-2cxm6\" (UID: \"dc8ebb5d-19af-4da1-8767-4b901d1fef8a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2cxm6" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.104976 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ebf8d4f5-6995-45dd-be49-491ced904443-client-ca\") pod 
\"route-controller-manager-6576b87f9c-87hqt\" (UID: \"ebf8d4f5-6995-45dd-be49-491ced904443\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-87hqt" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.104991 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2279b4a5-d4f2-455a-87e2-0774d6d57c40-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-ltqv2\" (UID: \"2279b4a5-d4f2-455a-87e2-0774d6d57c40\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ltqv2" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.105007 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dc8ebb5d-19af-4da1-8767-4b901d1fef8a-trusted-ca\") pod \"ingress-operator-5b745b69d9-2cxm6\" (UID: \"dc8ebb5d-19af-4da1-8767-4b901d1fef8a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2cxm6" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.105035 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6bbf8e83-d583-49de-a12c-6f0a3953dc67-default-certificate\") pod \"router-default-5444994796-kpmwl\" (UID: \"6bbf8e83-d583-49de-a12c-6f0a3953dc67\") " pod="openshift-ingress/router-default-5444994796-kpmwl" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.105051 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e40ce76-0ae5-42cc-9670-a40bb4b4e2e4-serving-cert\") pod \"etcd-operator-b45778765-qfc95\" (UID: \"9e40ce76-0ae5-42cc-9670-a40bb4b4e2e4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qfc95" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.105066 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/39ebc592-086a-43e6-87d6-c2d67607d511-oauth-serving-cert\") pod \"console-f9d7485db-q9zx6\" (UID: \"39ebc592-086a-43e6-87d6-c2d67607d511\") " pod="openshift-console/console-f9d7485db-q9zx6" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.105090 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsbqs\" (UniqueName: \"kubernetes.io/projected/d91d282e-a4f3-4bc9-9623-4640774f641a-kube-api-access-gsbqs\") pod \"image-pruner-29551680-hclh4\" (UID: \"d91d282e-a4f3-4bc9-9623-4640774f641a\") " pod="openshift-image-registry/image-pruner-29551680-hclh4" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.105111 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e5131de2-27e3-4ec2-8a59-7cc11f6865be-srv-cert\") pod \"catalog-operator-68c6474976-gcx4m\" (UID: \"e5131de2-27e3-4ec2-8a59-7cc11f6865be\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gcx4m" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.105126 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7r28\" (UniqueName: \"kubernetes.io/projected/2279b4a5-d4f2-455a-87e2-0774d6d57c40-kube-api-access-l7r28\") pod \"authentication-operator-69f744f599-ltqv2\" (UID: \"2279b4a5-d4f2-455a-87e2-0774d6d57c40\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ltqv2" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.105145 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6eeaccaa-9989-4a5c-91ed-67dfb7319e14-auth-proxy-config\") pod \"machine-approver-56656f9798-sjjvc\" (UID: \"6eeaccaa-9989-4a5c-91ed-67dfb7319e14\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sjjvc" Mar 10 00:09:08 crc kubenswrapper[4906]: 
I0310 00:09:08.105163 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/6eeaccaa-9989-4a5c-91ed-67dfb7319e14-machine-approver-tls\") pod \"machine-approver-56656f9798-sjjvc\" (UID: \"6eeaccaa-9989-4a5c-91ed-67dfb7319e14\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sjjvc" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.105212 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2279b4a5-d4f2-455a-87e2-0774d6d57c40-service-ca-bundle\") pod \"authentication-operator-69f744f599-ltqv2\" (UID: \"2279b4a5-d4f2-455a-87e2-0774d6d57c40\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ltqv2" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.105232 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dc8ebb5d-19af-4da1-8767-4b901d1fef8a-metrics-tls\") pod \"ingress-operator-5b745b69d9-2cxm6\" (UID: \"dc8ebb5d-19af-4da1-8767-4b901d1fef8a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2cxm6" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.105250 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/de2ea5bf-12b8-4ab0-a073-53df41c9646a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-427rj\" (UID: \"de2ea5bf-12b8-4ab0-a073-53df41c9646a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-427rj" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.105267 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ff15a729-bbdf-4422-8db2-0e429e76ee25-bound-sa-token\") pod 
\"cluster-image-registry-operator-dc59b4c8b-2g9zb\" (UID: \"ff15a729-bbdf-4422-8db2-0e429e76ee25\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2g9zb" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.105304 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfmln\" (UniqueName: \"kubernetes.io/projected/6eeaccaa-9989-4a5c-91ed-67dfb7319e14-kube-api-access-vfmln\") pod \"machine-approver-56656f9798-sjjvc\" (UID: \"6eeaccaa-9989-4a5c-91ed-67dfb7319e14\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sjjvc" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.105329 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6f87\" (UniqueName: \"kubernetes.io/projected/e5131de2-27e3-4ec2-8a59-7cc11f6865be-kube-api-access-x6f87\") pod \"catalog-operator-68c6474976-gcx4m\" (UID: \"e5131de2-27e3-4ec2-8a59-7cc11f6865be\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gcx4m" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.105344 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/68de0039-27ee-4f02-987f-9105c69c355f-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-gtbds\" (UID: \"68de0039-27ee-4f02-987f-9105c69c355f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gtbds" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.105403 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e5131de2-27e3-4ec2-8a59-7cc11f6865be-profile-collector-cert\") pod \"catalog-operator-68c6474976-gcx4m\" (UID: \"e5131de2-27e3-4ec2-8a59-7cc11f6865be\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gcx4m" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.105428 4906 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bbf8e83-d583-49de-a12c-6f0a3953dc67-metrics-certs\") pod \"router-default-5444994796-kpmwl\" (UID: \"6bbf8e83-d583-49de-a12c-6f0a3953dc67\") " pod="openshift-ingress/router-default-5444994796-kpmwl" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.105468 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ff15a729-bbdf-4422-8db2-0e429e76ee25-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-2g9zb\" (UID: \"ff15a729-bbdf-4422-8db2-0e429e76ee25\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2g9zb" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.105488 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hs46\" (UniqueName: \"kubernetes.io/projected/1c418134-7dc2-42f5-b1ce-c22a2b77a4b9-kube-api-access-6hs46\") pod \"downloads-7954f5f757-zkqrr\" (UID: \"1c418134-7dc2-42f5-b1ce-c22a2b77a4b9\") " pod="openshift-console/downloads-7954f5f757-zkqrr" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.105508 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/ff15a729-bbdf-4422-8db2-0e429e76ee25-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-2g9zb\" (UID: \"ff15a729-bbdf-4422-8db2-0e429e76ee25\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2g9zb" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.105549 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcfd4\" (UniqueName: \"kubernetes.io/projected/ff15a729-bbdf-4422-8db2-0e429e76ee25-kube-api-access-fcfd4\") pod \"cluster-image-registry-operator-dc59b4c8b-2g9zb\" (UID: 
\"ff15a729-bbdf-4422-8db2-0e429e76ee25\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2g9zb" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.105569 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/39ebc592-086a-43e6-87d6-c2d67607d511-console-serving-cert\") pod \"console-f9d7485db-q9zx6\" (UID: \"39ebc592-086a-43e6-87d6-c2d67607d511\") " pod="openshift-console/console-f9d7485db-q9zx6" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.105612 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dfsl\" (UniqueName: \"kubernetes.io/projected/68de0039-27ee-4f02-987f-9105c69c355f-kube-api-access-6dfsl\") pod \"multus-admission-controller-857f4d67dd-gtbds\" (UID: \"68de0039-27ee-4f02-987f-9105c69c355f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gtbds" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.105628 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/39ebc592-086a-43e6-87d6-c2d67607d511-console-oauth-config\") pod \"console-f9d7485db-q9zx6\" (UID: \"39ebc592-086a-43e6-87d6-c2d67607d511\") " pod="openshift-console/console-f9d7485db-q9zx6" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.105667 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de2ea5bf-12b8-4ab0-a073-53df41c9646a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-427rj\" (UID: \"de2ea5bf-12b8-4ab0-a073-53df41c9646a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-427rj" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.105686 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/2279b4a5-d4f2-455a-87e2-0774d6d57c40-config\") pod \"authentication-operator-69f744f599-ltqv2\" (UID: \"2279b4a5-d4f2-455a-87e2-0774d6d57c40\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ltqv2" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.105705 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/66a5f13c-92bf-4e0b-ac5e-f14de7c6f72a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-xd62p\" (UID: \"66a5f13c-92bf-4e0b-ac5e-f14de7c6f72a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xd62p" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.105751 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebf8d4f5-6995-45dd-be49-491ced904443-config\") pod \"route-controller-manager-6576b87f9c-87hqt\" (UID: \"ebf8d4f5-6995-45dd-be49-491ced904443\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-87hqt" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.105774 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d91d282e-a4f3-4bc9-9623-4640774f641a-serviceca\") pod \"image-pruner-29551680-hclh4\" (UID: \"d91d282e-a4f3-4bc9-9623-4640774f641a\") " pod="openshift-image-registry/image-pruner-29551680-hclh4" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.105789 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6eeaccaa-9989-4a5c-91ed-67dfb7319e14-config\") pod \"machine-approver-56656f9798-sjjvc\" (UID: \"6eeaccaa-9989-4a5c-91ed-67dfb7319e14\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sjjvc" Mar 10 00:09:08 crc 
kubenswrapper[4906]: I0310 00:09:08.105829 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kxtw\" (UniqueName: \"kubernetes.io/projected/cff04257-cb6e-44cf-93de-f4e2cdf6698d-kube-api-access-2kxtw\") pod \"dns-operator-744455d44c-qnppg\" (UID: \"cff04257-cb6e-44cf-93de-f4e2cdf6698d\") " pod="openshift-dns-operator/dns-operator-744455d44c-qnppg" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.105850 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sz5k6\" (UniqueName: \"kubernetes.io/projected/9e40ce76-0ae5-42cc-9670-a40bb4b4e2e4-kube-api-access-sz5k6\") pod \"etcd-operator-b45778765-qfc95\" (UID: \"9e40ce76-0ae5-42cc-9670-a40bb4b4e2e4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qfc95" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.105867 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/39ebc592-086a-43e6-87d6-c2d67607d511-service-ca\") pod \"console-f9d7485db-q9zx6\" (UID: \"39ebc592-086a-43e6-87d6-c2d67607d511\") " pod="openshift-console/console-f9d7485db-q9zx6" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.105867 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/376340e6-76cd-476b-b86b-350cb9edfecf-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-zxgxz\" (UID: \"376340e6-76cd-476b-b86b-350cb9edfecf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zxgxz" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.105885 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de2ea5bf-12b8-4ab0-a073-53df41c9646a-config\") pod \"kube-controller-manager-operator-78b949d7b-427rj\" (UID: \"de2ea5bf-12b8-4ab0-a073-53df41c9646a\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-427rj" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.105903 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5vhz\" (UniqueName: \"kubernetes.io/projected/6bbf8e83-d583-49de-a12c-6f0a3953dc67-kube-api-access-q5vhz\") pod \"router-default-5444994796-kpmwl\" (UID: \"6bbf8e83-d583-49de-a12c-6f0a3953dc67\") " pod="openshift-ingress/router-default-5444994796-kpmwl" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.105924 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39ebc592-086a-43e6-87d6-c2d67607d511-trusted-ca-bundle\") pod \"console-f9d7485db-q9zx6\" (UID: \"39ebc592-086a-43e6-87d6-c2d67607d511\") " pod="openshift-console/console-f9d7485db-q9zx6" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.105924 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/9e40ce76-0ae5-42cc-9670-a40bb4b4e2e4-etcd-ca\") pod \"etcd-operator-b45778765-qfc95\" (UID: \"9e40ce76-0ae5-42cc-9670-a40bb4b4e2e4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qfc95" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.105946 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f9945a5-74c6-482d-b4c4-1097e3efebe0-serving-cert\") pod \"openshift-config-operator-7777fb866f-vt2mt\" (UID: \"7f9945a5-74c6-482d-b4c4-1097e3efebe0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vt2mt" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.105964 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dc8ebb5d-19af-4da1-8767-4b901d1fef8a-bound-sa-token\") pod 
\"ingress-operator-5b745b69d9-2cxm6\" (UID: \"dc8ebb5d-19af-4da1-8767-4b901d1fef8a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2cxm6" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.105982 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b8bt\" (UniqueName: \"kubernetes.io/projected/376340e6-76cd-476b-b86b-350cb9edfecf-kube-api-access-4b8bt\") pod \"openshift-controller-manager-operator-756b6f6bc6-zxgxz\" (UID: \"376340e6-76cd-476b-b86b-350cb9edfecf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zxgxz" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.106015 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/9e40ce76-0ae5-42cc-9670-a40bb4b4e2e4-etcd-service-ca\") pod \"etcd-operator-b45778765-qfc95\" (UID: \"9e40ce76-0ae5-42cc-9670-a40bb4b4e2e4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qfc95" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.106034 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cff04257-cb6e-44cf-93de-f4e2cdf6698d-metrics-tls\") pod \"dns-operator-744455d44c-qnppg\" (UID: \"cff04257-cb6e-44cf-93de-f4e2cdf6698d\") " pod="openshift-dns-operator/dns-operator-744455d44c-qnppg" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.106057 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e40ce76-0ae5-42cc-9670-a40bb4b4e2e4-config\") pod \"etcd-operator-b45778765-qfc95\" (UID: \"9e40ce76-0ae5-42cc-9670-a40bb4b4e2e4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qfc95" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.106075 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" 
(UniqueName: \"kubernetes.io/empty-dir/7f9945a5-74c6-482d-b4c4-1097e3efebe0-available-featuregates\") pod \"openshift-config-operator-7777fb866f-vt2mt\" (UID: \"7f9945a5-74c6-482d-b4c4-1097e3efebe0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vt2mt" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.106092 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebf8d4f5-6995-45dd-be49-491ced904443-serving-cert\") pod \"route-controller-manager-6576b87f9c-87hqt\" (UID: \"ebf8d4f5-6995-45dd-be49-491ced904443\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-87hqt" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.106108 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6bbf8e83-d583-49de-a12c-6f0a3953dc67-stats-auth\") pod \"router-default-5444994796-kpmwl\" (UID: \"6bbf8e83-d583-49de-a12c-6f0a3953dc67\") " pod="openshift-ingress/router-default-5444994796-kpmwl" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.106133 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/376340e6-76cd-476b-b86b-350cb9edfecf-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-zxgxz\" (UID: \"376340e6-76cd-476b-b86b-350cb9edfecf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zxgxz" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.106154 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/ad986187-a8c3-4e1d-8bff-3f385af901e4-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-ldnm9\" (UID: \"ad986187-a8c3-4e1d-8bff-3f385af901e4\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ldnm9" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.106183 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2tz6\" (UniqueName: \"kubernetes.io/projected/ad986187-a8c3-4e1d-8bff-3f385af901e4-kube-api-access-n2tz6\") pod \"package-server-manager-789f6589d5-ldnm9\" (UID: \"ad986187-a8c3-4e1d-8bff-3f385af901e4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ldnm9" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.106537 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/39ebc592-086a-43e6-87d6-c2d67607d511-console-config\") pod \"console-f9d7485db-q9zx6\" (UID: \"39ebc592-086a-43e6-87d6-c2d67607d511\") " pod="openshift-console/console-f9d7485db-q9zx6" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.107463 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2279b4a5-d4f2-455a-87e2-0774d6d57c40-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-ltqv2\" (UID: \"2279b4a5-d4f2-455a-87e2-0774d6d57c40\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ltqv2" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.108541 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/39ebc592-086a-43e6-87d6-c2d67607d511-oauth-serving-cert\") pod \"console-f9d7485db-q9zx6\" (UID: \"39ebc592-086a-43e6-87d6-c2d67607d511\") " pod="openshift-console/console-f9d7485db-q9zx6" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.108652 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6eeaccaa-9989-4a5c-91ed-67dfb7319e14-auth-proxy-config\") pod 
\"machine-approver-56656f9798-sjjvc\" (UID: \"6eeaccaa-9989-4a5c-91ed-67dfb7319e14\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sjjvc" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.108656 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d91d282e-a4f3-4bc9-9623-4640774f641a-serviceca\") pod \"image-pruner-29551680-hclh4\" (UID: \"d91d282e-a4f3-4bc9-9623-4640774f641a\") " pod="openshift-image-registry/image-pruner-29551680-hclh4" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.108750 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39ebc592-086a-43e6-87d6-c2d67607d511-trusted-ca-bundle\") pod \"console-f9d7485db-q9zx6\" (UID: \"39ebc592-086a-43e6-87d6-c2d67607d511\") " pod="openshift-console/console-f9d7485db-q9zx6" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.109061 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2279b4a5-d4f2-455a-87e2-0774d6d57c40-serving-cert\") pod \"authentication-operator-69f744f599-ltqv2\" (UID: \"2279b4a5-d4f2-455a-87e2-0774d6d57c40\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ltqv2" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.109087 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ff15a729-bbdf-4422-8db2-0e429e76ee25-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-2g9zb\" (UID: \"ff15a729-bbdf-4422-8db2-0e429e76ee25\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2g9zb" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.109256 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/2279b4a5-d4f2-455a-87e2-0774d6d57c40-config\") pod \"authentication-operator-69f744f599-ltqv2\" (UID: \"2279b4a5-d4f2-455a-87e2-0774d6d57c40\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ltqv2" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.109278 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ebf8d4f5-6995-45dd-be49-491ced904443-client-ca\") pod \"route-controller-manager-6576b87f9c-87hqt\" (UID: \"ebf8d4f5-6995-45dd-be49-491ced904443\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-87hqt" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.109782 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebf8d4f5-6995-45dd-be49-491ced904443-config\") pod \"route-controller-manager-6576b87f9c-87hqt\" (UID: \"ebf8d4f5-6995-45dd-be49-491ced904443\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-87hqt" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.109863 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6eeaccaa-9989-4a5c-91ed-67dfb7319e14-config\") pod \"machine-approver-56656f9798-sjjvc\" (UID: \"6eeaccaa-9989-4a5c-91ed-67dfb7319e14\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sjjvc" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.109940 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e40ce76-0ae5-42cc-9670-a40bb4b4e2e4-config\") pod \"etcd-operator-b45778765-qfc95\" (UID: \"9e40ce76-0ae5-42cc-9670-a40bb4b4e2e4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qfc95" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.110267 4906 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/7f9945a5-74c6-482d-b4c4-1097e3efebe0-available-featuregates\") pod \"openshift-config-operator-7777fb866f-vt2mt\" (UID: \"7f9945a5-74c6-482d-b4c4-1097e3efebe0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vt2mt" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.110770 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/39ebc592-086a-43e6-87d6-c2d67607d511-service-ca\") pod \"console-f9d7485db-q9zx6\" (UID: \"39ebc592-086a-43e6-87d6-c2d67607d511\") " pod="openshift-console/console-f9d7485db-q9zx6" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.110808 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/9e40ce76-0ae5-42cc-9670-a40bb4b4e2e4-etcd-service-ca\") pod \"etcd-operator-b45778765-qfc95\" (UID: \"9e40ce76-0ae5-42cc-9670-a40bb4b4e2e4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qfc95" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.111007 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e40ce76-0ae5-42cc-9670-a40bb4b4e2e4-serving-cert\") pod \"etcd-operator-b45778765-qfc95\" (UID: \"9e40ce76-0ae5-42cc-9670-a40bb4b4e2e4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qfc95" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.111500 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2279b4a5-d4f2-455a-87e2-0774d6d57c40-service-ca-bundle\") pod \"authentication-operator-69f744f599-ltqv2\" (UID: \"2279b4a5-d4f2-455a-87e2-0774d6d57c40\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ltqv2" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.111857 4906 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f9945a5-74c6-482d-b4c4-1097e3efebe0-serving-cert\") pod \"openshift-config-operator-7777fb866f-vt2mt\" (UID: \"7f9945a5-74c6-482d-b4c4-1097e3efebe0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vt2mt" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.111925 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9e40ce76-0ae5-42cc-9670-a40bb4b4e2e4-etcd-client\") pod \"etcd-operator-b45778765-qfc95\" (UID: \"9e40ce76-0ae5-42cc-9670-a40bb4b4e2e4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qfc95" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.112799 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/ff15a729-bbdf-4422-8db2-0e429e76ee25-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-2g9zb\" (UID: \"ff15a729-bbdf-4422-8db2-0e429e76ee25\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2g9zb" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.113210 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/39ebc592-086a-43e6-87d6-c2d67607d511-console-serving-cert\") pod \"console-f9d7485db-q9zx6\" (UID: \"39ebc592-086a-43e6-87d6-c2d67607d511\") " pod="openshift-console/console-f9d7485db-q9zx6" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.113330 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/376340e6-76cd-476b-b86b-350cb9edfecf-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-zxgxz\" (UID: \"376340e6-76cd-476b-b86b-350cb9edfecf\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zxgxz" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.113373 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebf8d4f5-6995-45dd-be49-491ced904443-serving-cert\") pod \"route-controller-manager-6576b87f9c-87hqt\" (UID: \"ebf8d4f5-6995-45dd-be49-491ced904443\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-87hqt" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.113497 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/6eeaccaa-9989-4a5c-91ed-67dfb7319e14-machine-approver-tls\") pod \"machine-approver-56656f9798-sjjvc\" (UID: \"6eeaccaa-9989-4a5c-91ed-67dfb7319e14\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sjjvc" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.114064 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/39ebc592-086a-43e6-87d6-c2d67607d511-console-oauth-config\") pod \"console-f9d7485db-q9zx6\" (UID: \"39ebc592-086a-43e6-87d6-c2d67607d511\") " pod="openshift-console/console-f9d7485db-q9zx6" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.116742 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.129778 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cff04257-cb6e-44cf-93de-f4e2cdf6698d-metrics-tls\") pod \"dns-operator-744455d44c-qnppg\" (UID: \"cff04257-cb6e-44cf-93de-f4e2cdf6698d\") " pod="openshift-dns-operator/dns-operator-744455d44c-qnppg" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.137009 4906 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.156842 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.176457 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.196158 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.216612 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.237777 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.256226 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.267259 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dc8ebb5d-19af-4da1-8767-4b901d1fef8a-metrics-tls\") pod \"ingress-operator-5b745b69d9-2cxm6\" (UID: \"dc8ebb5d-19af-4da1-8767-4b901d1fef8a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2cxm6" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.281590 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.290052 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dc8ebb5d-19af-4da1-8767-4b901d1fef8a-trusted-ca\") pod 
\"ingress-operator-5b745b69d9-2cxm6\" (UID: \"dc8ebb5d-19af-4da1-8767-4b901d1fef8a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2cxm6" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.295563 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.318531 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.336023 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.356887 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.362355 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de2ea5bf-12b8-4ab0-a073-53df41c9646a-config\") pod \"kube-controller-manager-operator-78b949d7b-427rj\" (UID: \"de2ea5bf-12b8-4ab0-a073-53df41c9646a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-427rj" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.376617 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.396788 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.406482 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bbf8e83-d583-49de-a12c-6f0a3953dc67-service-ca-bundle\") pod 
\"router-default-5444994796-kpmwl\" (UID: \"6bbf8e83-d583-49de-a12c-6f0a3953dc67\") " pod="openshift-ingress/router-default-5444994796-kpmwl" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.416787 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.435865 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.443446 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bbf8e83-d583-49de-a12c-6f0a3953dc67-metrics-certs\") pod \"router-default-5444994796-kpmwl\" (UID: \"6bbf8e83-d583-49de-a12c-6f0a3953dc67\") " pod="openshift-ingress/router-default-5444994796-kpmwl" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.457335 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.476313 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.484501 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de2ea5bf-12b8-4ab0-a073-53df41c9646a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-427rj\" (UID: \"de2ea5bf-12b8-4ab0-a073-53df41c9646a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-427rj" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.496292 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 
10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.529285 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.536887 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.548219 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/ad986187-a8c3-4e1d-8bff-3f385af901e4-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-ldnm9\" (UID: \"ad986187-a8c3-4e1d-8bff-3f385af901e4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ldnm9" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.557432 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.564725 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6bbf8e83-d583-49de-a12c-6f0a3953dc67-default-certificate\") pod \"router-default-5444994796-kpmwl\" (UID: \"6bbf8e83-d583-49de-a12c-6f0a3953dc67\") " pod="openshift-ingress/router-default-5444994796-kpmwl" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.576248 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.587303 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6bbf8e83-d583-49de-a12c-6f0a3953dc67-stats-auth\") pod \"router-default-5444994796-kpmwl\" (UID: \"6bbf8e83-d583-49de-a12c-6f0a3953dc67\") " 
pod="openshift-ingress/router-default-5444994796-kpmwl" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.597176 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.617136 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.622834 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/68de0039-27ee-4f02-987f-9105c69c355f-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-gtbds\" (UID: \"68de0039-27ee-4f02-987f-9105c69c355f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gtbds" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.637295 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.643138 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e5131de2-27e3-4ec2-8a59-7cc11f6865be-profile-collector-cert\") pod \"catalog-operator-68c6474976-gcx4m\" (UID: \"e5131de2-27e3-4ec2-8a59-7cc11f6865be\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gcx4m" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.656826 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.677039 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.697028 4906 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.703474 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e5131de2-27e3-4ec2-8a59-7cc11f6865be-srv-cert\") pod \"catalog-operator-68c6474976-gcx4m\" (UID: \"e5131de2-27e3-4ec2-8a59-7cc11f6865be\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gcx4m" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.737170 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.756747 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.777775 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.822602 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.825897 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.837311 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.857021 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.863398 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/66a5f13c-92bf-4e0b-ac5e-f14de7c6f72a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-xd62p\" (UID: \"66a5f13c-92bf-4e0b-ac5e-f14de7c6f72a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xd62p" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.876442 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.897455 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.914453 4906 request.go:700] Waited for 1.006029635s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.918109 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.937566 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.957391 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.977712 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 10 00:09:08 crc kubenswrapper[4906]: I0310 00:09:08.998139 4906 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 10 00:09:09 crc kubenswrapper[4906]: I0310 00:09:09.018395 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 10 00:09:09 crc kubenswrapper[4906]: I0310 00:09:09.037178 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 10 00:09:09 crc kubenswrapper[4906]: I0310 00:09:09.058012 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 10 00:09:09 crc kubenswrapper[4906]: I0310 00:09:09.077836 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 10 00:09:09 crc kubenswrapper[4906]: I0310 00:09:09.096884 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 10 00:09:09 crc kubenswrapper[4906]: I0310 00:09:09.118127 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 10 00:09:09 crc kubenswrapper[4906]: I0310 00:09:09.137449 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 10 00:09:09 crc kubenswrapper[4906]: I0310 00:09:09.157442 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 10 00:09:09 crc kubenswrapper[4906]: I0310 00:09:09.178071 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 10 00:09:09 crc kubenswrapper[4906]: I0310 00:09:09.197075 4906 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 10 00:09:09 crc kubenswrapper[4906]: I0310 00:09:09.217836 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 10 00:09:09 crc kubenswrapper[4906]: I0310 00:09:09.237838 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 10 00:09:09 crc kubenswrapper[4906]: I0310 00:09:09.257134 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 10 00:09:09 crc kubenswrapper[4906]: I0310 00:09:09.277403 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 10 00:09:09 crc kubenswrapper[4906]: I0310 00:09:09.297274 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 10 00:09:09 crc kubenswrapper[4906]: I0310 00:09:09.317384 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 10 00:09:09 crc kubenswrapper[4906]: I0310 00:09:09.338335 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 10 00:09:09 crc kubenswrapper[4906]: I0310 00:09:09.357816 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 10 00:09:09 crc kubenswrapper[4906]: I0310 00:09:09.378168 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 10 00:09:09 crc kubenswrapper[4906]: I0310 00:09:09.397164 4906 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 10 00:09:09 crc kubenswrapper[4906]: I0310 00:09:09.416807 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 10 00:09:09 crc kubenswrapper[4906]: I0310 00:09:09.437193 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 10 00:09:09 crc kubenswrapper[4906]: I0310 00:09:09.456906 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 00:09:09 crc kubenswrapper[4906]: I0310 00:09:09.476353 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 00:09:09 crc kubenswrapper[4906]: I0310 00:09:09.497711 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 10 00:09:09 crc kubenswrapper[4906]: I0310 00:09:09.517008 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 10 00:09:09 crc kubenswrapper[4906]: I0310 00:09:09.537607 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 10 00:09:09 crc kubenswrapper[4906]: I0310 00:09:09.557578 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 10 00:09:09 crc kubenswrapper[4906]: I0310 00:09:09.576439 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 10 00:09:09 crc kubenswrapper[4906]: I0310 00:09:09.632870 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjdx4\" (UniqueName: \"kubernetes.io/projected/4b4509ca-5d20-4f5c-89ea-a910f792ff82-kube-api-access-vjdx4\") pod 
\"oauth-openshift-558db77b4-mq564\" (UID: \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\") " pod="openshift-authentication/oauth-openshift-558db77b4-mq564" Mar 10 00:09:09 crc kubenswrapper[4906]: I0310 00:09:09.648337 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fxxr\" (UniqueName: \"kubernetes.io/projected/e2a24c21-ab84-4989-a154-b8c9118c31bf-kube-api-access-6fxxr\") pod \"machine-api-operator-5694c8668f-lrj5w\" (UID: \"e2a24c21-ab84-4989-a154-b8c9118c31bf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lrj5w" Mar 10 00:09:09 crc kubenswrapper[4906]: I0310 00:09:09.670759 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8ks8\" (UniqueName: \"kubernetes.io/projected/f1401406-9964-4c43-8192-efd2f32732e5-kube-api-access-j8ks8\") pod \"apiserver-76f77b778f-t5bxf\" (UID: \"f1401406-9964-4c43-8192-efd2f32732e5\") " pod="openshift-apiserver/apiserver-76f77b778f-t5bxf" Mar 10 00:09:09 crc kubenswrapper[4906]: I0310 00:09:09.685187 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqjff\" (UniqueName: \"kubernetes.io/projected/39b95b69-1fe1-4e7c-9499-ca59220ca7eb-kube-api-access-nqjff\") pod \"cluster-samples-operator-665b6dd947-kzklf\" (UID: \"39b95b69-1fe1-4e7c-9499-ca59220ca7eb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kzklf" Mar 10 00:09:09 crc kubenswrapper[4906]: I0310 00:09:09.696959 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 10 00:09:09 crc kubenswrapper[4906]: I0310 00:09:09.709300 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwhqv\" (UniqueName: \"kubernetes.io/projected/b062963a-a86b-43bb-ba19-c49fa8ac7cf1-kube-api-access-dwhqv\") pod \"openshift-apiserver-operator-796bbdcf4f-b492s\" (UID: \"b062963a-a86b-43bb-ba19-c49fa8ac7cf1\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b492s" Mar 10 00:09:09 crc kubenswrapper[4906]: I0310 00:09:09.716913 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 10 00:09:09 crc kubenswrapper[4906]: I0310 00:09:09.740596 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 10 00:09:09 crc kubenswrapper[4906]: I0310 00:09:09.760033 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-lrj5w" Mar 10 00:09:09 crc kubenswrapper[4906]: I0310 00:09:09.780233 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5h2s\" (UniqueName: \"kubernetes.io/projected/85226c05-ab07-4243-89af-c58b7c3d1f43-kube-api-access-f5h2s\") pod \"controller-manager-879f6c89f-v25gg\" (UID: \"85226c05-ab07-4243-89af-c58b7c3d1f43\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v25gg" Mar 10 00:09:09 crc kubenswrapper[4906]: I0310 00:09:09.800452 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skcn8\" (UniqueName: \"kubernetes.io/projected/c9f070bb-41a2-458e-9818-063ffb52008a-kube-api-access-skcn8\") pod \"console-operator-58897d9998-x7t4l\" (UID: \"c9f070bb-41a2-458e-9818-063ffb52008a\") " pod="openshift-console-operator/console-operator-58897d9998-x7t4l" Mar 10 00:09:09 crc kubenswrapper[4906]: I0310 00:09:09.812428 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-x7t4l" Mar 10 00:09:09 crc kubenswrapper[4906]: I0310 00:09:09.816729 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 10 00:09:09 crc kubenswrapper[4906]: I0310 00:09:09.823443 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm925\" (UniqueName: \"kubernetes.io/projected/7ef57cce-ba04-4605-a245-edfad55f6f69-kube-api-access-pm925\") pod \"apiserver-7bbb656c7d-jws8x\" (UID: \"7ef57cce-ba04-4605-a245-edfad55f6f69\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jws8x" Mar 10 00:09:09 crc kubenswrapper[4906]: I0310 00:09:09.839179 4906 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 10 00:09:09 crc kubenswrapper[4906]: I0310 00:09:09.856135 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 10 00:09:09 crc kubenswrapper[4906]: I0310 00:09:09.861514 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-mq564" Mar 10 00:09:09 crc kubenswrapper[4906]: I0310 00:09:09.879611 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 10 00:09:09 crc kubenswrapper[4906]: I0310 00:09:09.897673 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 10 00:09:09 crc kubenswrapper[4906]: I0310 00:09:09.903194 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-t5bxf" Mar 10 00:09:09 crc kubenswrapper[4906]: I0310 00:09:09.917403 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 10 00:09:09 crc kubenswrapper[4906]: I0310 00:09:09.919750 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b492s" Mar 10 00:09:09 crc kubenswrapper[4906]: I0310 00:09:09.934446 4906 request.go:700] Waited for 1.880622993s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Mar 10 00:09:09 crc kubenswrapper[4906]: I0310 00:09:09.937495 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 10 00:09:09 crc kubenswrapper[4906]: I0310 00:09:09.956922 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kzklf" Mar 10 00:09:09 crc kubenswrapper[4906]: I0310 00:09:09.976297 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 10 00:09:09 crc kubenswrapper[4906]: I0310 00:09:09.990991 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jws8x" Mar 10 00:09:09 crc kubenswrapper[4906]: I0310 00:09:09.998149 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.017271 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.019143 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-v25gg" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.065793 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnkvl\" (UniqueName: \"kubernetes.io/projected/ebf8d4f5-6995-45dd-be49-491ced904443-kube-api-access-jnkvl\") pod \"route-controller-manager-6576b87f9c-87hqt\" (UID: \"ebf8d4f5-6995-45dd-be49-491ced904443\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-87hqt" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.087240 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2tz6\" (UniqueName: \"kubernetes.io/projected/ad986187-a8c3-4e1d-8bff-3f385af901e4-kube-api-access-n2tz6\") pod \"package-server-manager-789f6589d5-ldnm9\" (UID: \"ad986187-a8c3-4e1d-8bff-3f385af901e4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ldnm9" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.095188 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shz9k\" (UniqueName: \"kubernetes.io/projected/39ebc592-086a-43e6-87d6-c2d67607d511-kube-api-access-shz9k\") pod \"console-f9d7485db-q9zx6\" (UID: \"39ebc592-086a-43e6-87d6-c2d67607d511\") " pod="openshift-console/console-f9d7485db-q9zx6" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.110970 4906 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-lrj5w"] Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.111043 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-x7t4l"] Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.150012 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kc2k\" (UniqueName: \"kubernetes.io/projected/dc8ebb5d-19af-4da1-8767-4b901d1fef8a-kube-api-access-6kc2k\") pod \"ingress-operator-5b745b69d9-2cxm6\" (UID: \"dc8ebb5d-19af-4da1-8767-4b901d1fef8a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2cxm6" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.158421 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7gvw\" (UniqueName: \"kubernetes.io/projected/66a5f13c-92bf-4e0b-ac5e-f14de7c6f72a-kube-api-access-t7gvw\") pod \"control-plane-machine-set-operator-78cbb6b69f-xd62p\" (UID: \"66a5f13c-92bf-4e0b-ac5e-f14de7c6f72a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xd62p" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.161612 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72gzx\" (UniqueName: \"kubernetes.io/projected/7f9945a5-74c6-482d-b4c4-1097e3efebe0-kube-api-access-72gzx\") pod \"openshift-config-operator-7777fb866f-vt2mt\" (UID: \"7f9945a5-74c6-482d-b4c4-1097e3efebe0\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vt2mt" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.169309 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-87hqt" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.174056 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dfsl\" (UniqueName: \"kubernetes.io/projected/68de0039-27ee-4f02-987f-9105c69c355f-kube-api-access-6dfsl\") pod \"multus-admission-controller-857f4d67dd-gtbds\" (UID: \"68de0039-27ee-4f02-987f-9105c69c355f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gtbds" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.175067 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vt2mt" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.184799 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mq564"] Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.190069 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-q9zx6" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.193129 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/de2ea5bf-12b8-4ab0-a073-53df41c9646a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-427rj\" (UID: \"de2ea5bf-12b8-4ab0-a073-53df41c9646a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-427rj" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.219188 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ff15a729-bbdf-4422-8db2-0e429e76ee25-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-2g9zb\" (UID: \"ff15a729-bbdf-4422-8db2-0e429e76ee25\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2g9zb" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.232127 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6f87\" (UniqueName: \"kubernetes.io/projected/e5131de2-27e3-4ec2-8a59-7cc11f6865be-kube-api-access-x6f87\") pod \"catalog-operator-68c6474976-gcx4m\" (UID: \"e5131de2-27e3-4ec2-8a59-7cc11f6865be\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gcx4m" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.257760 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfmln\" (UniqueName: \"kubernetes.io/projected/6eeaccaa-9989-4a5c-91ed-67dfb7319e14-kube-api-access-vfmln\") pod \"machine-approver-56656f9798-sjjvc\" (UID: \"6eeaccaa-9989-4a5c-91ed-67dfb7319e14\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sjjvc" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.260604 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ldnm9" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.278033 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-427rj" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.284690 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-gtbds" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.290238 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5vhz\" (UniqueName: \"kubernetes.io/projected/6bbf8e83-d583-49de-a12c-6f0a3953dc67-kube-api-access-q5vhz\") pod \"router-default-5444994796-kpmwl\" (UID: \"6bbf8e83-d583-49de-a12c-6f0a3953dc67\") " pod="openshift-ingress/router-default-5444994796-kpmwl" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.291568 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gcx4m" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.297079 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7r28\" (UniqueName: \"kubernetes.io/projected/2279b4a5-d4f2-455a-87e2-0774d6d57c40-kube-api-access-l7r28\") pod \"authentication-operator-69f744f599-ltqv2\" (UID: \"2279b4a5-d4f2-455a-87e2-0774d6d57c40\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ltqv2" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.318121 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xd62p" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.320402 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsbqs\" (UniqueName: \"kubernetes.io/projected/d91d282e-a4f3-4bc9-9623-4640774f641a-kube-api-access-gsbqs\") pod \"image-pruner-29551680-hclh4\" (UID: \"d91d282e-a4f3-4bc9-9623-4640774f641a\") " pod="openshift-image-registry/image-pruner-29551680-hclh4" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.335067 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hs46\" (UniqueName: \"kubernetes.io/projected/1c418134-7dc2-42f5-b1ce-c22a2b77a4b9-kube-api-access-6hs46\") pod \"downloads-7954f5f757-zkqrr\" (UID: \"1c418134-7dc2-42f5-b1ce-c22a2b77a4b9\") " pod="openshift-console/downloads-7954f5f757-zkqrr" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.340456 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kzklf"] Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.352009 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcfd4\" (UniqueName: \"kubernetes.io/projected/ff15a729-bbdf-4422-8db2-0e429e76ee25-kube-api-access-fcfd4\") pod \"cluster-image-registry-operator-dc59b4c8b-2g9zb\" (UID: \"ff15a729-bbdf-4422-8db2-0e429e76ee25\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2g9zb" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.374695 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sz5k6\" (UniqueName: \"kubernetes.io/projected/9e40ce76-0ae5-42cc-9670-a40bb4b4e2e4-kube-api-access-sz5k6\") pod \"etcd-operator-b45778765-qfc95\" (UID: \"9e40ce76-0ae5-42cc-9670-a40bb4b4e2e4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qfc95" Mar 10 00:09:10 crc 
kubenswrapper[4906]: I0310 00:09:10.409997 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b8bt\" (UniqueName: \"kubernetes.io/projected/376340e6-76cd-476b-b86b-350cb9edfecf-kube-api-access-4b8bt\") pod \"openshift-controller-manager-operator-756b6f6bc6-zxgxz\" (UID: \"376340e6-76cd-476b-b86b-350cb9edfecf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zxgxz" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.420570 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dc8ebb5d-19af-4da1-8767-4b901d1fef8a-bound-sa-token\") pod \"ingress-operator-5b745b69d9-2cxm6\" (UID: \"dc8ebb5d-19af-4da1-8767-4b901d1fef8a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2cxm6" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.433474 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kxtw\" (UniqueName: \"kubernetes.io/projected/cff04257-cb6e-44cf-93de-f4e2cdf6698d-kube-api-access-2kxtw\") pod \"dns-operator-744455d44c-qnppg\" (UID: \"cff04257-cb6e-44cf-93de-f4e2cdf6698d\") " pod="openshift-dns-operator/dns-operator-744455d44c-qnppg" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.454123 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/06401874-a064-42cf-b1db-0e5bff007b1c-srv-cert\") pod \"olm-operator-6b444d44fb-wcpw2\" (UID: \"06401874-a064-42cf-b1db-0e5bff007b1c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wcpw2" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.454173 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/06401874-a064-42cf-b1db-0e5bff007b1c-profile-collector-cert\") pod 
\"olm-operator-6b444d44fb-wcpw2\" (UID: \"06401874-a064-42cf-b1db-0e5bff007b1c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wcpw2" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.458425 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.458498 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/47ee6fa1-0ef0-414f-91af-0f170e94c390-ca-trust-extracted\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.458545 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfkfp\" (UniqueName: \"kubernetes.io/projected/47ee6fa1-0ef0-414f-91af-0f170e94c390-kube-api-access-rfkfp\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.458574 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/47ee6fa1-0ef0-414f-91af-0f170e94c390-installation-pull-secrets\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.458852 
4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/47ee6fa1-0ef0-414f-91af-0f170e94c390-bound-sa-token\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.458926 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/47ee6fa1-0ef0-414f-91af-0f170e94c390-registry-tls\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.458954 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/47ee6fa1-0ef0-414f-91af-0f170e94c390-trusted-ca\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.459028 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2v7wd\" (UniqueName: \"kubernetes.io/projected/06401874-a064-42cf-b1db-0e5bff007b1c-kube-api-access-2v7wd\") pod \"olm-operator-6b444d44fb-wcpw2\" (UID: \"06401874-a064-42cf-b1db-0e5bff007b1c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wcpw2" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.459092 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/47ee6fa1-0ef0-414f-91af-0f170e94c390-registry-certificates\") pod \"image-registry-697d97f7c8-66jxp\" (UID: 
\"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:10 crc kubenswrapper[4906]: E0310 00:09:10.459214 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:10.959191251 +0000 UTC m=+177.107086363 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66jxp" (UID: "47ee6fa1-0ef0-414f-91af-0f170e94c390") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.465217 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b492s"] Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.484288 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-ltqv2" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.498065 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sjjvc" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.498530 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-t5bxf"] Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.503281 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-zkqrr" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.512062 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zxgxz" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.519213 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29551680-hclh4" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.527776 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-qfc95" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.538129 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2g9zb" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.548076 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-qnppg" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.553118 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2cxm6" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.560364 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.561466 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21590dce-fca3-4e23-8fd7-ade8be24206c-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qgvms\" (UID: \"21590dce-fca3-4e23-8fd7-ade8be24206c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qgvms" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.561501 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/1412799e-cd89-42f5-895a-76031f913a55-node-bootstrap-token\") pod \"machine-config-server-9rn4f\" (UID: \"1412799e-cd89-42f5-895a-76031f913a55\") " pod="openshift-machine-config-operator/machine-config-server-9rn4f" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.561531 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d49db0cb-de7e-4194-9f6e-8b58cf5f98fb-socket-dir\") pod \"csi-hostpathplugin-x6g9x\" (UID: \"d49db0cb-de7e-4194-9f6e-8b58cf5f98fb\") " pod="hostpath-provisioner/csi-hostpathplugin-x6g9x" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.561610 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-647km\" (UniqueName: 
\"kubernetes.io/projected/0492b9b7-f88e-48ee-88e6-83aa55d8ce65-kube-api-access-647km\") pod \"collect-profiles-29551680-7lx2x\" (UID: \"0492b9b7-f88e-48ee-88e6-83aa55d8ce65\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551680-7lx2x" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.561676 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7jhp\" (UniqueName: \"kubernetes.io/projected/a46863f6-02e5-4d35-8b59-216377e41403-kube-api-access-p7jhp\") pod \"marketplace-operator-79b997595-vw9rg\" (UID: \"a46863f6-02e5-4d35-8b59-216377e41403\") " pod="openshift-marketplace/marketplace-operator-79b997595-vw9rg" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.561698 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/332da74f-063b-48aa-8b86-8646fddcadda-apiservice-cert\") pod \"packageserver-d55dfcdfc-rlnd7\" (UID: \"332da74f-063b-48aa-8b86-8646fddcadda\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rlnd7" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.561769 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d49db0cb-de7e-4194-9f6e-8b58cf5f98fb-registration-dir\") pod \"csi-hostpathplugin-x6g9x\" (UID: \"d49db0cb-de7e-4194-9f6e-8b58cf5f98fb\") " pod="hostpath-provisioner/csi-hostpathplugin-x6g9x" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.561788 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g5gk\" (UniqueName: \"kubernetes.io/projected/6b23a6fe-6dcc-4d84-8591-31079d563929-kube-api-access-6g5gk\") pod \"machine-config-operator-74547568cd-qgdqx\" (UID: \"6b23a6fe-6dcc-4d84-8591-31079d563929\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qgdqx" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.561806 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/d49db0cb-de7e-4194-9f6e-8b58cf5f98fb-mountpoint-dir\") pod \"csi-hostpathplugin-x6g9x\" (UID: \"d49db0cb-de7e-4194-9f6e-8b58cf5f98fb\") " pod="hostpath-provisioner/csi-hostpathplugin-x6g9x" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.561841 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e154434-b989-4968-be1d-7c9982d34241-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-nkxsv\" (UID: \"5e154434-b989-4968-be1d-7c9982d34241\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nkxsv" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.561861 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6b23a6fe-6dcc-4d84-8591-31079d563929-images\") pod \"machine-config-operator-74547568cd-qgdqx\" (UID: \"6b23a6fe-6dcc-4d84-8591-31079d563929\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qgdqx" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.561896 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/47ee6fa1-0ef0-414f-91af-0f170e94c390-ca-trust-extracted\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.561916 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-rfkfp\" (UniqueName: \"kubernetes.io/projected/47ee6fa1-0ef0-414f-91af-0f170e94c390-kube-api-access-rfkfp\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.561939 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/cd304b2a-9e83-4f93-9f48-c88f6ab93b16-signing-key\") pod \"service-ca-9c57cc56f-xzvmk\" (UID: \"cd304b2a-9e83-4f93-9f48-c88f6ab93b16\") " pod="openshift-service-ca/service-ca-9c57cc56f-xzvmk" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.561969 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/21590dce-fca3-4e23-8fd7-ade8be24206c-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qgvms\" (UID: \"21590dce-fca3-4e23-8fd7-ade8be24206c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qgvms" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.562005 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbb2x\" (UniqueName: \"kubernetes.io/projected/380fb2b7-9fcf-4c1a-879a-a4a6195f4e58-kube-api-access-qbb2x\") pod \"ingress-canary-b4rmk\" (UID: \"380fb2b7-9fcf-4c1a-879a-a4a6195f4e58\") " pod="openshift-ingress-canary/ingress-canary-b4rmk" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.562101 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/47ee6fa1-0ef0-414f-91af-0f170e94c390-installation-pull-secrets\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.562120 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a46863f6-02e5-4d35-8b59-216377e41403-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-vw9rg\" (UID: \"a46863f6-02e5-4d35-8b59-216377e41403\") " pod="openshift-marketplace/marketplace-operator-79b997595-vw9rg" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.562182 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxqgz\" (UniqueName: \"kubernetes.io/projected/220fec7e-5622-4a55-ab08-359de5d87c6f-kube-api-access-qxqgz\") pod \"dns-default-j4vpr\" (UID: \"220fec7e-5622-4a55-ab08-359de5d87c6f\") " pod="openshift-dns/dns-default-j4vpr" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.562255 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/793885bc-abbf-44be-8e32-b2088cf449bd-proxy-tls\") pod \"machine-config-controller-84d6567774-mnn4p\" (UID: \"793885bc-abbf-44be-8e32-b2088cf449bd\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mnn4p" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.562285 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e154434-b989-4968-be1d-7c9982d34241-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-nkxsv\" (UID: \"5e154434-b989-4968-be1d-7c9982d34241\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nkxsv" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.562301 4906 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/380fb2b7-9fcf-4c1a-879a-a4a6195f4e58-cert\") pod \"ingress-canary-b4rmk\" (UID: \"380fb2b7-9fcf-4c1a-879a-a4a6195f4e58\") " pod="openshift-ingress-canary/ingress-canary-b4rmk" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.562346 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nbzf\" (UniqueName: \"kubernetes.io/projected/094c6270-b610-42c0-a6ce-3c146cb6bb6c-kube-api-access-2nbzf\") pod \"auto-csr-approver-29551688-fkkqj\" (UID: \"094c6270-b610-42c0-a6ce-3c146cb6bb6c\") " pod="openshift-infra/auto-csr-approver-29551688-fkkqj" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.562370 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/332da74f-063b-48aa-8b86-8646fddcadda-webhook-cert\") pod \"packageserver-d55dfcdfc-rlnd7\" (UID: \"332da74f-063b-48aa-8b86-8646fddcadda\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rlnd7" Mar 10 00:09:10 crc kubenswrapper[4906]: E0310 00:09:10.562395 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:11.062371409 +0000 UTC m=+177.210266521 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.562416 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/d49db0cb-de7e-4194-9f6e-8b58cf5f98fb-plugins-dir\") pod \"csi-hostpathplugin-x6g9x\" (UID: \"d49db0cb-de7e-4194-9f6e-8b58cf5f98fb\") " pod="hostpath-provisioner/csi-hostpathplugin-x6g9x" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.562468 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/47ee6fa1-0ef0-414f-91af-0f170e94c390-bound-sa-token\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.562501 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/220fec7e-5622-4a55-ab08-359de5d87c6f-config-volume\") pod \"dns-default-j4vpr\" (UID: \"220fec7e-5622-4a55-ab08-359de5d87c6f\") " pod="openshift-dns/dns-default-j4vpr" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.562612 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/851d4efd-45ba-4f03-bfa6-75bde8aae278-config\") pod \"service-ca-operator-777779d784-fvmnw\" (UID: 
\"851d4efd-45ba-4f03-bfa6-75bde8aae278\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fvmnw" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.562765 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6b23a6fe-6dcc-4d84-8591-31079d563929-proxy-tls\") pod \"machine-config-operator-74547568cd-qgdqx\" (UID: \"6b23a6fe-6dcc-4d84-8591-31079d563929\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qgdqx" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.562788 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0492b9b7-f88e-48ee-88e6-83aa55d8ce65-config-volume\") pod \"collect-profiles-29551680-7lx2x\" (UID: \"0492b9b7-f88e-48ee-88e6-83aa55d8ce65\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551680-7lx2x" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.562806 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/851d4efd-45ba-4f03-bfa6-75bde8aae278-serving-cert\") pod \"service-ca-operator-777779d784-fvmnw\" (UID: \"851d4efd-45ba-4f03-bfa6-75bde8aae278\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fvmnw" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.562851 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/cd304b2a-9e83-4f93-9f48-c88f6ab93b16-signing-cabundle\") pod \"service-ca-9c57cc56f-xzvmk\" (UID: \"cd304b2a-9e83-4f93-9f48-c88f6ab93b16\") " pod="openshift-service-ca/service-ca-9c57cc56f-xzvmk" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.562885 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-n46qd\" (UniqueName: \"kubernetes.io/projected/851d4efd-45ba-4f03-bfa6-75bde8aae278-kube-api-access-n46qd\") pod \"service-ca-operator-777779d784-fvmnw\" (UID: \"851d4efd-45ba-4f03-bfa6-75bde8aae278\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fvmnw" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.562916 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz7dd\" (UniqueName: \"kubernetes.io/projected/d366e911-15d3-4899-b843-fd66f0a9416b-kube-api-access-rz7dd\") pod \"migrator-59844c95c7-kfw9l\" (UID: \"d366e911-15d3-4899-b843-fd66f0a9416b\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kfw9l" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.562934 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5tpj\" (UniqueName: \"kubernetes.io/projected/cd304b2a-9e83-4f93-9f48-c88f6ab93b16-kube-api-access-w5tpj\") pod \"service-ca-9c57cc56f-xzvmk\" (UID: \"cd304b2a-9e83-4f93-9f48-c88f6ab93b16\") " pod="openshift-service-ca/service-ca-9c57cc56f-xzvmk" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.562967 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/47ee6fa1-0ef0-414f-91af-0f170e94c390-registry-tls\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.562987 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/d49db0cb-de7e-4194-9f6e-8b58cf5f98fb-csi-data-dir\") pod \"csi-hostpathplugin-x6g9x\" (UID: \"d49db0cb-de7e-4194-9f6e-8b58cf5f98fb\") " 
pod="hostpath-provisioner/csi-hostpathplugin-x6g9x" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.563051 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/47ee6fa1-0ef0-414f-91af-0f170e94c390-trusted-ca\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.563211 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxwwl\" (UniqueName: \"kubernetes.io/projected/332da74f-063b-48aa-8b86-8646fddcadda-kube-api-access-bxwwl\") pod \"packageserver-d55dfcdfc-rlnd7\" (UID: \"332da74f-063b-48aa-8b86-8646fddcadda\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rlnd7" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.563286 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6b23a6fe-6dcc-4d84-8591-31079d563929-auth-proxy-config\") pod \"machine-config-operator-74547568cd-qgdqx\" (UID: \"6b23a6fe-6dcc-4d84-8591-31079d563929\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qgdqx" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.563306 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt7b9\" (UniqueName: \"kubernetes.io/projected/793885bc-abbf-44be-8e32-b2088cf449bd-kube-api-access-zt7b9\") pod \"machine-config-controller-84d6567774-mnn4p\" (UID: \"793885bc-abbf-44be-8e32-b2088cf449bd\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mnn4p" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.563351 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-2v7wd\" (UniqueName: \"kubernetes.io/projected/06401874-a064-42cf-b1db-0e5bff007b1c-kube-api-access-2v7wd\") pod \"olm-operator-6b444d44fb-wcpw2\" (UID: \"06401874-a064-42cf-b1db-0e5bff007b1c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wcpw2" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.563373 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9b04397-d728-4bd8-a4ec-55bb6ebde285-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-krjqb\" (UID: \"d9b04397-d728-4bd8-a4ec-55bb6ebde285\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-krjqb" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.563392 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/47ee6fa1-0ef0-414f-91af-0f170e94c390-registry-certificates\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.563412 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjgc7\" (UniqueName: \"kubernetes.io/projected/1412799e-cd89-42f5-895a-76031f913a55-kube-api-access-hjgc7\") pod \"machine-config-server-9rn4f\" (UID: \"1412799e-cd89-42f5-895a-76031f913a55\") " pod="openshift-machine-config-operator/machine-config-server-9rn4f" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.563470 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9b04397-d728-4bd8-a4ec-55bb6ebde285-config\") pod \"kube-apiserver-operator-766d6c64bb-krjqb\" (UID: \"d9b04397-d728-4bd8-a4ec-55bb6ebde285\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-krjqb" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.563535 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/332da74f-063b-48aa-8b86-8646fddcadda-tmpfs\") pod \"packageserver-d55dfcdfc-rlnd7\" (UID: \"332da74f-063b-48aa-8b86-8646fddcadda\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rlnd7" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.569383 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-kpmwl" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.571109 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/47ee6fa1-0ef0-414f-91af-0f170e94c390-ca-trust-extracted\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.571816 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmlrh\" (UniqueName: \"kubernetes.io/projected/5e154434-b989-4968-be1d-7c9982d34241-kube-api-access-kmlrh\") pod \"kube-storage-version-migrator-operator-b67b599dd-nkxsv\" (UID: \"5e154434-b989-4968-be1d-7c9982d34241\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nkxsv" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.571972 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d9b04397-d728-4bd8-a4ec-55bb6ebde285-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-krjqb\" (UID: \"d9b04397-d728-4bd8-a4ec-55bb6ebde285\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-krjqb" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.572020 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/793885bc-abbf-44be-8e32-b2088cf449bd-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-mnn4p\" (UID: \"793885bc-abbf-44be-8e32-b2088cf449bd\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mnn4p" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.572070 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6599z\" (UniqueName: \"kubernetes.io/projected/d49db0cb-de7e-4194-9f6e-8b58cf5f98fb-kube-api-access-6599z\") pod \"csi-hostpathplugin-x6g9x\" (UID: \"d49db0cb-de7e-4194-9f6e-8b58cf5f98fb\") " pod="hostpath-provisioner/csi-hostpathplugin-x6g9x" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.572139 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/06401874-a064-42cf-b1db-0e5bff007b1c-srv-cert\") pod \"olm-operator-6b444d44fb-wcpw2\" (UID: \"06401874-a064-42cf-b1db-0e5bff007b1c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wcpw2" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.572175 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/06401874-a064-42cf-b1db-0e5bff007b1c-profile-collector-cert\") pod \"olm-operator-6b444d44fb-wcpw2\" (UID: \"06401874-a064-42cf-b1db-0e5bff007b1c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wcpw2" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.572203 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a46863f6-02e5-4d35-8b59-216377e41403-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-vw9rg\" (UID: \"a46863f6-02e5-4d35-8b59-216377e41403\") " pod="openshift-marketplace/marketplace-operator-79b997595-vw9rg" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.573489 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/47ee6fa1-0ef0-414f-91af-0f170e94c390-installation-pull-secrets\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.574380 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/47ee6fa1-0ef0-414f-91af-0f170e94c390-registry-certificates\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.576070 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/1412799e-cd89-42f5-895a-76031f913a55-certs\") pod \"machine-config-server-9rn4f\" (UID: \"1412799e-cd89-42f5-895a-76031f913a55\") " pod="openshift-machine-config-operator/machine-config-server-9rn4f" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.576134 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 
00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.576532 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/220fec7e-5622-4a55-ab08-359de5d87c6f-metrics-tls\") pod \"dns-default-j4vpr\" (UID: \"220fec7e-5622-4a55-ab08-359de5d87c6f\") " pod="openshift-dns/dns-default-j4vpr" Mar 10 00:09:10 crc kubenswrapper[4906]: E0310 00:09:10.576575 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:11.07655829 +0000 UTC m=+177.224453402 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66jxp" (UID: "47ee6fa1-0ef0-414f-91af-0f170e94c390") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.589857 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/47ee6fa1-0ef0-414f-91af-0f170e94c390-trusted-ca\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.591243 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/06401874-a064-42cf-b1db-0e5bff007b1c-profile-collector-cert\") pod \"olm-operator-6b444d44fb-wcpw2\" (UID: \"06401874-a064-42cf-b1db-0e5bff007b1c\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wcpw2" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.591404 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/06401874-a064-42cf-b1db-0e5bff007b1c-srv-cert\") pod \"olm-operator-6b444d44fb-wcpw2\" (UID: \"06401874-a064-42cf-b1db-0e5bff007b1c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wcpw2" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.595202 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/47ee6fa1-0ef0-414f-91af-0f170e94c390-registry-tls\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.595449 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21590dce-fca3-4e23-8fd7-ade8be24206c-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qgvms\" (UID: \"21590dce-fca3-4e23-8fd7-ade8be24206c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qgvms" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.595583 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0492b9b7-f88e-48ee-88e6-83aa55d8ce65-secret-volume\") pod \"collect-profiles-29551680-7lx2x\" (UID: \"0492b9b7-f88e-48ee-88e6-83aa55d8ce65\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551680-7lx2x" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.611216 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2v7wd\" (UniqueName: 
\"kubernetes.io/projected/06401874-a064-42cf-b1db-0e5bff007b1c-kube-api-access-2v7wd\") pod \"olm-operator-6b444d44fb-wcpw2\" (UID: \"06401874-a064-42cf-b1db-0e5bff007b1c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wcpw2" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.611255 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/47ee6fa1-0ef0-414f-91af-0f170e94c390-bound-sa-token\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.632200 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-v25gg"] Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.632248 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-jws8x"] Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.637795 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ldnm9"] Mar 10 00:09:10 crc kubenswrapper[4906]: W0310 00:09:10.642033 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6eeaccaa_9989_4a5c_91ed_67dfb7319e14.slice/crio-b384e27559a547bca67a8463fe608a6dd0463525229399d5baf03d1ec5a522d0 WatchSource:0}: Error finding container b384e27559a547bca67a8463fe608a6dd0463525229399d5baf03d1ec5a522d0: Status 404 returned error can't find the container with id b384e27559a547bca67a8463fe608a6dd0463525229399d5baf03d1ec5a522d0 Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.655677 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfkfp\" (UniqueName: 
\"kubernetes.io/projected/47ee6fa1-0ef0-414f-91af-0f170e94c390-kube-api-access-rfkfp\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.697031 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.697265 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/d49db0cb-de7e-4194-9f6e-8b58cf5f98fb-mountpoint-dir\") pod \"csi-hostpathplugin-x6g9x\" (UID: \"d49db0cb-de7e-4194-9f6e-8b58cf5f98fb\") " pod="hostpath-provisioner/csi-hostpathplugin-x6g9x" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.697293 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e154434-b989-4968-be1d-7c9982d34241-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-nkxsv\" (UID: \"5e154434-b989-4968-be1d-7c9982d34241\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nkxsv" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.697314 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6b23a6fe-6dcc-4d84-8591-31079d563929-images\") pod \"machine-config-operator-74547568cd-qgdqx\" (UID: \"6b23a6fe-6dcc-4d84-8591-31079d563929\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qgdqx" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.697333 4906 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/cd304b2a-9e83-4f93-9f48-c88f6ab93b16-signing-key\") pod \"service-ca-9c57cc56f-xzvmk\" (UID: \"cd304b2a-9e83-4f93-9f48-c88f6ab93b16\") " pod="openshift-service-ca/service-ca-9c57cc56f-xzvmk" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.697352 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/21590dce-fca3-4e23-8fd7-ade8be24206c-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qgvms\" (UID: \"21590dce-fca3-4e23-8fd7-ade8be24206c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qgvms" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.698438 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbb2x\" (UniqueName: \"kubernetes.io/projected/380fb2b7-9fcf-4c1a-879a-a4a6195f4e58-kube-api-access-qbb2x\") pod \"ingress-canary-b4rmk\" (UID: \"380fb2b7-9fcf-4c1a-879a-a4a6195f4e58\") " pod="openshift-ingress-canary/ingress-canary-b4rmk" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.698473 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a46863f6-02e5-4d35-8b59-216377e41403-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-vw9rg\" (UID: \"a46863f6-02e5-4d35-8b59-216377e41403\") " pod="openshift-marketplace/marketplace-operator-79b997595-vw9rg" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.698495 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxqgz\" (UniqueName: \"kubernetes.io/projected/220fec7e-5622-4a55-ab08-359de5d87c6f-kube-api-access-qxqgz\") pod \"dns-default-j4vpr\" (UID: \"220fec7e-5622-4a55-ab08-359de5d87c6f\") " 
pod="openshift-dns/dns-default-j4vpr" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.698525 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/793885bc-abbf-44be-8e32-b2088cf449bd-proxy-tls\") pod \"machine-config-controller-84d6567774-mnn4p\" (UID: \"793885bc-abbf-44be-8e32-b2088cf449bd\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mnn4p" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.698542 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e154434-b989-4968-be1d-7c9982d34241-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-nkxsv\" (UID: \"5e154434-b989-4968-be1d-7c9982d34241\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nkxsv" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.698563 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/380fb2b7-9fcf-4c1a-879a-a4a6195f4e58-cert\") pod \"ingress-canary-b4rmk\" (UID: \"380fb2b7-9fcf-4c1a-879a-a4a6195f4e58\") " pod="openshift-ingress-canary/ingress-canary-b4rmk" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.698583 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nbzf\" (UniqueName: \"kubernetes.io/projected/094c6270-b610-42c0-a6ce-3c146cb6bb6c-kube-api-access-2nbzf\") pod \"auto-csr-approver-29551688-fkkqj\" (UID: \"094c6270-b610-42c0-a6ce-3c146cb6bb6c\") " pod="openshift-infra/auto-csr-approver-29551688-fkkqj" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.698611 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/332da74f-063b-48aa-8b86-8646fddcadda-webhook-cert\") pod 
\"packageserver-d55dfcdfc-rlnd7\" (UID: \"332da74f-063b-48aa-8b86-8646fddcadda\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rlnd7" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.698627 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/d49db0cb-de7e-4194-9f6e-8b58cf5f98fb-plugins-dir\") pod \"csi-hostpathplugin-x6g9x\" (UID: \"d49db0cb-de7e-4194-9f6e-8b58cf5f98fb\") " pod="hostpath-provisioner/csi-hostpathplugin-x6g9x" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.698661 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/220fec7e-5622-4a55-ab08-359de5d87c6f-config-volume\") pod \"dns-default-j4vpr\" (UID: \"220fec7e-5622-4a55-ab08-359de5d87c6f\") " pod="openshift-dns/dns-default-j4vpr" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.698697 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/851d4efd-45ba-4f03-bfa6-75bde8aae278-config\") pod \"service-ca-operator-777779d784-fvmnw\" (UID: \"851d4efd-45ba-4f03-bfa6-75bde8aae278\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fvmnw" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.698717 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6b23a6fe-6dcc-4d84-8591-31079d563929-proxy-tls\") pod \"machine-config-operator-74547568cd-qgdqx\" (UID: \"6b23a6fe-6dcc-4d84-8591-31079d563929\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qgdqx" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.698736 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0492b9b7-f88e-48ee-88e6-83aa55d8ce65-config-volume\") 
pod \"collect-profiles-29551680-7lx2x\" (UID: \"0492b9b7-f88e-48ee-88e6-83aa55d8ce65\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551680-7lx2x" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.698758 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/851d4efd-45ba-4f03-bfa6-75bde8aae278-serving-cert\") pod \"service-ca-operator-777779d784-fvmnw\" (UID: \"851d4efd-45ba-4f03-bfa6-75bde8aae278\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fvmnw" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.698779 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/cd304b2a-9e83-4f93-9f48-c88f6ab93b16-signing-cabundle\") pod \"service-ca-9c57cc56f-xzvmk\" (UID: \"cd304b2a-9e83-4f93-9f48-c88f6ab93b16\") " pod="openshift-service-ca/service-ca-9c57cc56f-xzvmk" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.698798 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n46qd\" (UniqueName: \"kubernetes.io/projected/851d4efd-45ba-4f03-bfa6-75bde8aae278-kube-api-access-n46qd\") pod \"service-ca-operator-777779d784-fvmnw\" (UID: \"851d4efd-45ba-4f03-bfa6-75bde8aae278\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fvmnw" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.698816 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rz7dd\" (UniqueName: \"kubernetes.io/projected/d366e911-15d3-4899-b843-fd66f0a9416b-kube-api-access-rz7dd\") pod \"migrator-59844c95c7-kfw9l\" (UID: \"d366e911-15d3-4899-b843-fd66f0a9416b\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kfw9l" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.698848 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-w5tpj\" (UniqueName: \"kubernetes.io/projected/cd304b2a-9e83-4f93-9f48-c88f6ab93b16-kube-api-access-w5tpj\") pod \"service-ca-9c57cc56f-xzvmk\" (UID: \"cd304b2a-9e83-4f93-9f48-c88f6ab93b16\") " pod="openshift-service-ca/service-ca-9c57cc56f-xzvmk" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.698877 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/d49db0cb-de7e-4194-9f6e-8b58cf5f98fb-csi-data-dir\") pod \"csi-hostpathplugin-x6g9x\" (UID: \"d49db0cb-de7e-4194-9f6e-8b58cf5f98fb\") " pod="hostpath-provisioner/csi-hostpathplugin-x6g9x" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.698925 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxwwl\" (UniqueName: \"kubernetes.io/projected/332da74f-063b-48aa-8b86-8646fddcadda-kube-api-access-bxwwl\") pod \"packageserver-d55dfcdfc-rlnd7\" (UID: \"332da74f-063b-48aa-8b86-8646fddcadda\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rlnd7" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.698947 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6b23a6fe-6dcc-4d84-8591-31079d563929-auth-proxy-config\") pod \"machine-config-operator-74547568cd-qgdqx\" (UID: \"6b23a6fe-6dcc-4d84-8591-31079d563929\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qgdqx" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.698965 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zt7b9\" (UniqueName: \"kubernetes.io/projected/793885bc-abbf-44be-8e32-b2088cf449bd-kube-api-access-zt7b9\") pod \"machine-config-controller-84d6567774-mnn4p\" (UID: \"793885bc-abbf-44be-8e32-b2088cf449bd\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mnn4p" Mar 10 00:09:10 
crc kubenswrapper[4906]: I0310 00:09:10.698986 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9b04397-d728-4bd8-a4ec-55bb6ebde285-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-krjqb\" (UID: \"d9b04397-d728-4bd8-a4ec-55bb6ebde285\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-krjqb" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.699008 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjgc7\" (UniqueName: \"kubernetes.io/projected/1412799e-cd89-42f5-895a-76031f913a55-kube-api-access-hjgc7\") pod \"machine-config-server-9rn4f\" (UID: \"1412799e-cd89-42f5-895a-76031f913a55\") " pod="openshift-machine-config-operator/machine-config-server-9rn4f" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.699028 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9b04397-d728-4bd8-a4ec-55bb6ebde285-config\") pod \"kube-apiserver-operator-766d6c64bb-krjqb\" (UID: \"d9b04397-d728-4bd8-a4ec-55bb6ebde285\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-krjqb" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.699057 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmlrh\" (UniqueName: \"kubernetes.io/projected/5e154434-b989-4968-be1d-7c9982d34241-kube-api-access-kmlrh\") pod \"kube-storage-version-migrator-operator-b67b599dd-nkxsv\" (UID: \"5e154434-b989-4968-be1d-7c9982d34241\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nkxsv" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.699074 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/332da74f-063b-48aa-8b86-8646fddcadda-tmpfs\") pod 
\"packageserver-d55dfcdfc-rlnd7\" (UID: \"332da74f-063b-48aa-8b86-8646fddcadda\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rlnd7" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.699101 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d9b04397-d728-4bd8-a4ec-55bb6ebde285-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-krjqb\" (UID: \"d9b04397-d728-4bd8-a4ec-55bb6ebde285\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-krjqb" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.699118 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/793885bc-abbf-44be-8e32-b2088cf449bd-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-mnn4p\" (UID: \"793885bc-abbf-44be-8e32-b2088cf449bd\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mnn4p" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.699145 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6599z\" (UniqueName: \"kubernetes.io/projected/d49db0cb-de7e-4194-9f6e-8b58cf5f98fb-kube-api-access-6599z\") pod \"csi-hostpathplugin-x6g9x\" (UID: \"d49db0cb-de7e-4194-9f6e-8b58cf5f98fb\") " pod="hostpath-provisioner/csi-hostpathplugin-x6g9x" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.699167 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a46863f6-02e5-4d35-8b59-216377e41403-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-vw9rg\" (UID: \"a46863f6-02e5-4d35-8b59-216377e41403\") " pod="openshift-marketplace/marketplace-operator-79b997595-vw9rg" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.699194 4906 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/1412799e-cd89-42f5-895a-76031f913a55-certs\") pod \"machine-config-server-9rn4f\" (UID: \"1412799e-cd89-42f5-895a-76031f913a55\") " pod="openshift-machine-config-operator/machine-config-server-9rn4f" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.699218 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/220fec7e-5622-4a55-ab08-359de5d87c6f-metrics-tls\") pod \"dns-default-j4vpr\" (UID: \"220fec7e-5622-4a55-ab08-359de5d87c6f\") " pod="openshift-dns/dns-default-j4vpr" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.699247 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21590dce-fca3-4e23-8fd7-ade8be24206c-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qgvms\" (UID: \"21590dce-fca3-4e23-8fd7-ade8be24206c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qgvms" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.699266 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0492b9b7-f88e-48ee-88e6-83aa55d8ce65-secret-volume\") pod \"collect-profiles-29551680-7lx2x\" (UID: \"0492b9b7-f88e-48ee-88e6-83aa55d8ce65\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551680-7lx2x" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.699285 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21590dce-fca3-4e23-8fd7-ade8be24206c-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qgvms\" (UID: \"21590dce-fca3-4e23-8fd7-ade8be24206c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qgvms" Mar 10 00:09:10 crc 
kubenswrapper[4906]: I0310 00:09:10.699323 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/1412799e-cd89-42f5-895a-76031f913a55-node-bootstrap-token\") pod \"machine-config-server-9rn4f\" (UID: \"1412799e-cd89-42f5-895a-76031f913a55\") " pod="openshift-machine-config-operator/machine-config-server-9rn4f" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.699341 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d49db0cb-de7e-4194-9f6e-8b58cf5f98fb-socket-dir\") pod \"csi-hostpathplugin-x6g9x\" (UID: \"d49db0cb-de7e-4194-9f6e-8b58cf5f98fb\") " pod="hostpath-provisioner/csi-hostpathplugin-x6g9x" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.699360 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-647km\" (UniqueName: \"kubernetes.io/projected/0492b9b7-f88e-48ee-88e6-83aa55d8ce65-kube-api-access-647km\") pod \"collect-profiles-29551680-7lx2x\" (UID: \"0492b9b7-f88e-48ee-88e6-83aa55d8ce65\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551680-7lx2x" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.699378 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7jhp\" (UniqueName: \"kubernetes.io/projected/a46863f6-02e5-4d35-8b59-216377e41403-kube-api-access-p7jhp\") pod \"marketplace-operator-79b997595-vw9rg\" (UID: \"a46863f6-02e5-4d35-8b59-216377e41403\") " pod="openshift-marketplace/marketplace-operator-79b997595-vw9rg" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.699394 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/332da74f-063b-48aa-8b86-8646fddcadda-apiservice-cert\") pod \"packageserver-d55dfcdfc-rlnd7\" (UID: \"332da74f-063b-48aa-8b86-8646fddcadda\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rlnd7" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.699412 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d49db0cb-de7e-4194-9f6e-8b58cf5f98fb-registration-dir\") pod \"csi-hostpathplugin-x6g9x\" (UID: \"d49db0cb-de7e-4194-9f6e-8b58cf5f98fb\") " pod="hostpath-provisioner/csi-hostpathplugin-x6g9x" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.699447 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6g5gk\" (UniqueName: \"kubernetes.io/projected/6b23a6fe-6dcc-4d84-8591-31079d563929-kube-api-access-6g5gk\") pod \"machine-config-operator-74547568cd-qgdqx\" (UID: \"6b23a6fe-6dcc-4d84-8591-31079d563929\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qgdqx" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.699761 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/d49db0cb-de7e-4194-9f6e-8b58cf5f98fb-plugins-dir\") pod \"csi-hostpathplugin-x6g9x\" (UID: \"d49db0cb-de7e-4194-9f6e-8b58cf5f98fb\") " pod="hostpath-provisioner/csi-hostpathplugin-x6g9x" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.700821 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/cd304b2a-9e83-4f93-9f48-c88f6ab93b16-signing-cabundle\") pod \"service-ca-9c57cc56f-xzvmk\" (UID: \"cd304b2a-9e83-4f93-9f48-c88f6ab93b16\") " pod="openshift-service-ca/service-ca-9c57cc56f-xzvmk" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.701086 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/d49db0cb-de7e-4194-9f6e-8b58cf5f98fb-csi-data-dir\") pod \"csi-hostpathplugin-x6g9x\" (UID: 
\"d49db0cb-de7e-4194-9f6e-8b58cf5f98fb\") " pod="hostpath-provisioner/csi-hostpathplugin-x6g9x" Mar 10 00:09:10 crc kubenswrapper[4906]: E0310 00:09:10.701296 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:11.201270966 +0000 UTC m=+177.349166078 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.701362 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/d49db0cb-de7e-4194-9f6e-8b58cf5f98fb-mountpoint-dir\") pod \"csi-hostpathplugin-x6g9x\" (UID: \"d49db0cb-de7e-4194-9f6e-8b58cf5f98fb\") " pod="hostpath-provisioner/csi-hostpathplugin-x6g9x" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.701414 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/220fec7e-5622-4a55-ab08-359de5d87c6f-config-volume\") pod \"dns-default-j4vpr\" (UID: \"220fec7e-5622-4a55-ab08-359de5d87c6f\") " pod="openshift-dns/dns-default-j4vpr" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.702947 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d49db0cb-de7e-4194-9f6e-8b58cf5f98fb-socket-dir\") pod \"csi-hostpathplugin-x6g9x\" (UID: 
\"d49db0cb-de7e-4194-9f6e-8b58cf5f98fb\") " pod="hostpath-provisioner/csi-hostpathplugin-x6g9x" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.706312 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d49db0cb-de7e-4194-9f6e-8b58cf5f98fb-registration-dir\") pod \"csi-hostpathplugin-x6g9x\" (UID: \"d49db0cb-de7e-4194-9f6e-8b58cf5f98fb\") " pod="hostpath-provisioner/csi-hostpathplugin-x6g9x" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.706946 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e154434-b989-4968-be1d-7c9982d34241-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-nkxsv\" (UID: \"5e154434-b989-4968-be1d-7c9982d34241\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nkxsv" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.707474 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6b23a6fe-6dcc-4d84-8591-31079d563929-auth-proxy-config\") pod \"machine-config-operator-74547568cd-qgdqx\" (UID: \"6b23a6fe-6dcc-4d84-8591-31079d563929\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qgdqx" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.713271 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9b04397-d728-4bd8-a4ec-55bb6ebde285-config\") pod \"kube-apiserver-operator-766d6c64bb-krjqb\" (UID: \"d9b04397-d728-4bd8-a4ec-55bb6ebde285\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-krjqb" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.713696 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/6b23a6fe-6dcc-4d84-8591-31079d563929-images\") pod \"machine-config-operator-74547568cd-qgdqx\" (UID: \"6b23a6fe-6dcc-4d84-8591-31079d563929\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qgdqx" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.714435 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21590dce-fca3-4e23-8fd7-ade8be24206c-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qgvms\" (UID: \"21590dce-fca3-4e23-8fd7-ade8be24206c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qgvms" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.714846 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/332da74f-063b-48aa-8b86-8646fddcadda-tmpfs\") pod \"packageserver-d55dfcdfc-rlnd7\" (UID: \"332da74f-063b-48aa-8b86-8646fddcadda\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rlnd7" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.715219 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/851d4efd-45ba-4f03-bfa6-75bde8aae278-config\") pod \"service-ca-operator-777779d784-fvmnw\" (UID: \"851d4efd-45ba-4f03-bfa6-75bde8aae278\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fvmnw" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.715757 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/851d4efd-45ba-4f03-bfa6-75bde8aae278-serving-cert\") pod \"service-ca-operator-777779d784-fvmnw\" (UID: \"851d4efd-45ba-4f03-bfa6-75bde8aae278\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fvmnw" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.715770 4906 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a46863f6-02e5-4d35-8b59-216377e41403-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-vw9rg\" (UID: \"a46863f6-02e5-4d35-8b59-216377e41403\") " pod="openshift-marketplace/marketplace-operator-79b997595-vw9rg" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.716120 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/793885bc-abbf-44be-8e32-b2088cf449bd-proxy-tls\") pod \"machine-config-controller-84d6567774-mnn4p\" (UID: \"793885bc-abbf-44be-8e32-b2088cf449bd\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mnn4p" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.716336 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9b04397-d728-4bd8-a4ec-55bb6ebde285-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-krjqb\" (UID: \"d9b04397-d728-4bd8-a4ec-55bb6ebde285\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-krjqb" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.721723 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6b23a6fe-6dcc-4d84-8591-31079d563929-proxy-tls\") pod \"machine-config-operator-74547568cd-qgdqx\" (UID: \"6b23a6fe-6dcc-4d84-8591-31079d563929\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qgdqx" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.723238 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/332da74f-063b-48aa-8b86-8646fddcadda-apiservice-cert\") pod \"packageserver-d55dfcdfc-rlnd7\" (UID: \"332da74f-063b-48aa-8b86-8646fddcadda\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rlnd7" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.723356 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e154434-b989-4968-be1d-7c9982d34241-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-nkxsv\" (UID: \"5e154434-b989-4968-be1d-7c9982d34241\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nkxsv" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.723947 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/1412799e-cd89-42f5-895a-76031f913a55-node-bootstrap-token\") pod \"machine-config-server-9rn4f\" (UID: \"1412799e-cd89-42f5-895a-76031f913a55\") " pod="openshift-machine-config-operator/machine-config-server-9rn4f" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.724381 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0492b9b7-f88e-48ee-88e6-83aa55d8ce65-secret-volume\") pod \"collect-profiles-29551680-7lx2x\" (UID: \"0492b9b7-f88e-48ee-88e6-83aa55d8ce65\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551680-7lx2x" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.724996 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a46863f6-02e5-4d35-8b59-216377e41403-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-vw9rg\" (UID: \"a46863f6-02e5-4d35-8b59-216377e41403\") " pod="openshift-marketplace/marketplace-operator-79b997595-vw9rg" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.725774 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/cd304b2a-9e83-4f93-9f48-c88f6ab93b16-signing-key\") pod \"service-ca-9c57cc56f-xzvmk\" (UID: \"cd304b2a-9e83-4f93-9f48-c88f6ab93b16\") " pod="openshift-service-ca/service-ca-9c57cc56f-xzvmk" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.725980 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0492b9b7-f88e-48ee-88e6-83aa55d8ce65-config-volume\") pod \"collect-profiles-29551680-7lx2x\" (UID: \"0492b9b7-f88e-48ee-88e6-83aa55d8ce65\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551680-7lx2x" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.726085 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/332da74f-063b-48aa-8b86-8646fddcadda-webhook-cert\") pod \"packageserver-d55dfcdfc-rlnd7\" (UID: \"332da74f-063b-48aa-8b86-8646fddcadda\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rlnd7" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.726274 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21590dce-fca3-4e23-8fd7-ade8be24206c-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qgvms\" (UID: \"21590dce-fca3-4e23-8fd7-ade8be24206c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qgvms" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.726971 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/1412799e-cd89-42f5-895a-76031f913a55-certs\") pod \"machine-config-server-9rn4f\" (UID: \"1412799e-cd89-42f5-895a-76031f913a55\") " pod="openshift-machine-config-operator/machine-config-server-9rn4f" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.729279 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/220fec7e-5622-4a55-ab08-359de5d87c6f-metrics-tls\") pod \"dns-default-j4vpr\" (UID: \"220fec7e-5622-4a55-ab08-359de5d87c6f\") " pod="openshift-dns/dns-default-j4vpr" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.729491 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/380fb2b7-9fcf-4c1a-879a-a4a6195f4e58-cert\") pod \"ingress-canary-b4rmk\" (UID: \"380fb2b7-9fcf-4c1a-879a-a4a6195f4e58\") " pod="openshift-ingress-canary/ingress-canary-b4rmk" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.738526 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-87hqt"] Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.748144 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/793885bc-abbf-44be-8e32-b2088cf449bd-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-mnn4p\" (UID: \"793885bc-abbf-44be-8e32-b2088cf449bd\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mnn4p" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.760469 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zt7b9\" (UniqueName: \"kubernetes.io/projected/793885bc-abbf-44be-8e32-b2088cf449bd-kube-api-access-zt7b9\") pod \"machine-config-controller-84d6567774-mnn4p\" (UID: \"793885bc-abbf-44be-8e32-b2088cf449bd\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mnn4p" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.769704 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-vt2mt"] Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.769915 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-6g5gk\" (UniqueName: \"kubernetes.io/projected/6b23a6fe-6dcc-4d84-8591-31079d563929-kube-api-access-6g5gk\") pod \"machine-config-operator-74547568cd-qgdqx\" (UID: \"6b23a6fe-6dcc-4d84-8591-31079d563929\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qgdqx" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.787012 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5tpj\" (UniqueName: \"kubernetes.io/projected/cd304b2a-9e83-4f93-9f48-c88f6ab93b16-kube-api-access-w5tpj\") pod \"service-ca-9c57cc56f-xzvmk\" (UID: \"cd304b2a-9e83-4f93-9f48-c88f6ab93b16\") " pod="openshift-service-ca/service-ca-9c57cc56f-xzvmk" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.798843 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n46qd\" (UniqueName: \"kubernetes.io/projected/851d4efd-45ba-4f03-bfa6-75bde8aae278-kube-api-access-n46qd\") pod \"service-ca-operator-777779d784-fvmnw\" (UID: \"851d4efd-45ba-4f03-bfa6-75bde8aae278\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fvmnw" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.804546 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xd62p"] Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.821282 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:10 crc kubenswrapper[4906]: E0310 00:09:10.822870 4906 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:11.322855864 +0000 UTC m=+177.470750976 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66jxp" (UID: "47ee6fa1-0ef0-414f-91af-0f170e94c390") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.859277 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gcx4m"] Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.866595 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbb2x\" (UniqueName: \"kubernetes.io/projected/380fb2b7-9fcf-4c1a-879a-a4a6195f4e58-kube-api-access-qbb2x\") pod \"ingress-canary-b4rmk\" (UID: \"380fb2b7-9fcf-4c1a-879a-a4a6195f4e58\") " pod="openshift-ingress-canary/ingress-canary-b4rmk" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.867402 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rz7dd\" (UniqueName: \"kubernetes.io/projected/d366e911-15d3-4899-b843-fd66f0a9416b-kube-api-access-rz7dd\") pod \"migrator-59844c95c7-kfw9l\" (UID: \"d366e911-15d3-4899-b843-fd66f0a9416b\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kfw9l" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.870391 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxwwl\" (UniqueName: \"kubernetes.io/projected/332da74f-063b-48aa-8b86-8646fddcadda-kube-api-access-bxwwl\") pod 
\"packageserver-d55dfcdfc-rlnd7\" (UID: \"332da74f-063b-48aa-8b86-8646fddcadda\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rlnd7" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.879710 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjgc7\" (UniqueName: \"kubernetes.io/projected/1412799e-cd89-42f5-895a-76031f913a55-kube-api-access-hjgc7\") pod \"machine-config-server-9rn4f\" (UID: \"1412799e-cd89-42f5-895a-76031f913a55\") " pod="openshift-machine-config-operator/machine-config-server-9rn4f" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.881979 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-427rj"] Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.883079 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jws8x" event={"ID":"7ef57cce-ba04-4605-a245-edfad55f6f69","Type":"ContainerStarted","Data":"4a48ce8ef197e28e2e06959a91c5a694ece94ddf6aef0daecf95d4e0096c92cd"} Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.885827 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vt2mt" event={"ID":"7f9945a5-74c6-482d-b4c4-1097e3efebe0","Type":"ContainerStarted","Data":"904d6a4e4d86f6153e8814ac8ab36c18643e94a180938c00018d02456c8f52f5"} Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.898770 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kzklf" event={"ID":"39b95b69-1fe1-4e7c-9499-ca59220ca7eb","Type":"ContainerStarted","Data":"b1c32b3f3f9a6f8ead8340465c1a08df30d5846f76f8224c786c5d4d4c85a4e2"} Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.898830 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kzklf" event={"ID":"39b95b69-1fe1-4e7c-9499-ca59220ca7eb","Type":"ContainerStarted","Data":"5efe013db0ce50d1b1d24b9ac07794faba7240392a8d0cc56920ee8dc166fc2a"} Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.899736 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wcpw2" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.903339 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-v25gg" event={"ID":"85226c05-ab07-4243-89af-c58b7c3d1f43","Type":"ContainerStarted","Data":"8842ce2138db810d473bfd654d346fe7eefa4bd83570543528fc8359ff74fe0a"} Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.904840 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmlrh\" (UniqueName: \"kubernetes.io/projected/5e154434-b989-4968-be1d-7c9982d34241-kube-api-access-kmlrh\") pod \"kube-storage-version-migrator-operator-b67b599dd-nkxsv\" (UID: \"5e154434-b989-4968-be1d-7c9982d34241\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nkxsv" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.905726 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b492s" event={"ID":"b062963a-a86b-43bb-ba19-c49fa8ac7cf1","Type":"ContainerStarted","Data":"58dc043c08fc9b6b93046a39a70901e1a50b8fa22112d4b5fc5ad84d61ffb5d6"} Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.910292 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-gtbds"] Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.911323 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zxgxz"] Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.922579 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d9b04397-d728-4bd8-a4ec-55bb6ebde285-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-krjqb\" (UID: \"d9b04397-d728-4bd8-a4ec-55bb6ebde285\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-krjqb" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.923118 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:10 crc kubenswrapper[4906]: E0310 00:09:10.923675 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:11.423659784 +0000 UTC m=+177.571554896 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.926533 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-t5bxf" event={"ID":"f1401406-9964-4c43-8192-efd2f32732e5","Type":"ContainerStarted","Data":"211e161f7327127f540e5f5396e03d0500ce05cd8f1a45027b63f01a41a8511d"} Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.926993 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kfw9l" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.933347 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-krjqb" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.937921 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-647km\" (UniqueName: \"kubernetes.io/projected/0492b9b7-f88e-48ee-88e6-83aa55d8ce65-kube-api-access-647km\") pod \"collect-profiles-29551680-7lx2x\" (UID: \"0492b9b7-f88e-48ee-88e6-83aa55d8ce65\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551680-7lx2x" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.938977 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-kpmwl" event={"ID":"6bbf8e83-d583-49de-a12c-6f0a3953dc67","Type":"ContainerStarted","Data":"ff64be0c9ae38cd325aab7d4b6db76b05ff43a84c18f862dcbd1a6dd4b604bee"} Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.940069 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-q9zx6"] Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.941856 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nkxsv" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.946870 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ldnm9" event={"ID":"ad986187-a8c3-4e1d-8bff-3f385af901e4","Type":"ContainerStarted","Data":"6de05632539b58434e421bb4011eb6e2f7ec5d0ce88a5dfecca5e50ca5d28dbf"} Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.949028 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qgdqx" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.951430 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-87hqt" event={"ID":"ebf8d4f5-6995-45dd-be49-491ced904443","Type":"ContainerStarted","Data":"1c2fee16554e2c72b5959c960a9110b2cb0dc7a027f9abbcd9c873cc6082a7ff"} Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.954332 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rlnd7" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.961677 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-mq564" event={"ID":"4b4509ca-5d20-4f5c-89ea-a910f792ff82","Type":"ContainerStarted","Data":"1b21cc89a8de7a59f495e570312062f232cb09206d98ad3e7b913ac58c3590b6"} Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.961737 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-mq564" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.961751 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-mq564" event={"ID":"4b4509ca-5d20-4f5c-89ea-a910f792ff82","Type":"ContainerStarted","Data":"307cad07582259a45f6385c96a34fdcb026d172826af0c76f2444a424de835b0"} Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.962846 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7jhp\" (UniqueName: \"kubernetes.io/projected/a46863f6-02e5-4d35-8b59-216377e41403-kube-api-access-p7jhp\") pod \"marketplace-operator-79b997595-vw9rg\" (UID: \"a46863f6-02e5-4d35-8b59-216377e41403\") " pod="openshift-marketplace/marketplace-operator-79b997595-vw9rg" Mar 10 00:09:10 crc kubenswrapper[4906]: 
I0310 00:09:10.967814 4906 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-mq564 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" start-of-body= Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.967864 4906 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-mq564" podUID="4b4509ca-5d20-4f5c-89ea-a910f792ff82" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.969093 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551680-7lx2x" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.977559 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/21590dce-fca3-4e23-8fd7-ade8be24206c-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qgvms\" (UID: \"21590dce-fca3-4e23-8fd7-ade8be24206c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qgvms" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.977912 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mnn4p" Mar 10 00:09:10 crc kubenswrapper[4906]: I0310 00:09:10.986282 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-fvmnw" Mar 10 00:09:11 crc kubenswrapper[4906]: I0310 00:09:11.003941 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sjjvc" event={"ID":"6eeaccaa-9989-4a5c-91ed-67dfb7319e14","Type":"ContainerStarted","Data":"b384e27559a547bca67a8463fe608a6dd0463525229399d5baf03d1ec5a522d0"} Mar 10 00:09:11 crc kubenswrapper[4906]: I0310 00:09:11.013341 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-xzvmk" Mar 10 00:09:11 crc kubenswrapper[4906]: I0310 00:09:11.014305 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nbzf\" (UniqueName: \"kubernetes.io/projected/094c6270-b610-42c0-a6ce-3c146cb6bb6c-kube-api-access-2nbzf\") pod \"auto-csr-approver-29551688-fkkqj\" (UID: \"094c6270-b610-42c0-a6ce-3c146cb6bb6c\") " pod="openshift-infra/auto-csr-approver-29551688-fkkqj" Mar 10 00:09:11 crc kubenswrapper[4906]: I0310 00:09:11.021541 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-9rn4f" Mar 10 00:09:11 crc kubenswrapper[4906]: I0310 00:09:11.025345 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:11 crc kubenswrapper[4906]: E0310 00:09:11.026794 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-10 00:09:11.526778949 +0000 UTC m=+177.674674061 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66jxp" (UID: "47ee6fa1-0ef0-414f-91af-0f170e94c390") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:11 crc kubenswrapper[4906]: I0310 00:09:11.032321 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-b4rmk" Mar 10 00:09:11 crc kubenswrapper[4906]: I0310 00:09:11.036555 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-lrj5w" event={"ID":"e2a24c21-ab84-4989-a154-b8c9118c31bf","Type":"ContainerStarted","Data":"f2051563043aec9f33ea572ce4e6d8b1022d1cf1276a926380e20afe3a64abfe"} Mar 10 00:09:11 crc kubenswrapper[4906]: I0310 00:09:11.036604 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-lrj5w" event={"ID":"e2a24c21-ab84-4989-a154-b8c9118c31bf","Type":"ContainerStarted","Data":"0a26fb60f00dbc8a0743357d738b2160f6000ba68f8419eb5678279c8bfd30c2"} Mar 10 00:09:11 crc kubenswrapper[4906]: I0310 00:09:11.036617 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-lrj5w" event={"ID":"e2a24c21-ab84-4989-a154-b8c9118c31bf","Type":"ContainerStarted","Data":"3c9a83fb2641b0a9acb28268f4b72e5684d1e04102c46abcb65f75ce801cecab"} Mar 10 00:09:11 crc kubenswrapper[4906]: I0310 00:09:11.040683 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-x7t4l" 
event={"ID":"c9f070bb-41a2-458e-9818-063ffb52008a","Type":"ContainerStarted","Data":"e06d8673e0ff0c5af23ee8313d97cf8e88534aa789d9d5003014d3c954f38353"} Mar 10 00:09:11 crc kubenswrapper[4906]: I0310 00:09:11.040732 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-x7t4l" event={"ID":"c9f070bb-41a2-458e-9818-063ffb52008a","Type":"ContainerStarted","Data":"107835447dbc55535d2d25472a2c2515cd138e2ea2316742f8704594e6f594c2"} Mar 10 00:09:11 crc kubenswrapper[4906]: I0310 00:09:11.042970 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-x7t4l" Mar 10 00:09:11 crc kubenswrapper[4906]: I0310 00:09:11.044170 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6599z\" (UniqueName: \"kubernetes.io/projected/d49db0cb-de7e-4194-9f6e-8b58cf5f98fb-kube-api-access-6599z\") pod \"csi-hostpathplugin-x6g9x\" (UID: \"d49db0cb-de7e-4194-9f6e-8b58cf5f98fb\") " pod="hostpath-provisioner/csi-hostpathplugin-x6g9x" Mar 10 00:09:11 crc kubenswrapper[4906]: I0310 00:09:11.044185 4906 patch_prober.go:28] interesting pod/console-operator-58897d9998-x7t4l container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Mar 10 00:09:11 crc kubenswrapper[4906]: I0310 00:09:11.044291 4906 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-x7t4l" podUID="c9f070bb-41a2-458e-9818-063ffb52008a" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" Mar 10 00:09:11 crc kubenswrapper[4906]: W0310 00:09:11.053285 4906 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39ebc592_086a_43e6_87d6_c2d67607d511.slice/crio-632ce4b9610079a02cc3820d9ad5d40be7d6f4d4c18cd1fd36f48d1c2b318b48 WatchSource:0}: Error finding container 632ce4b9610079a02cc3820d9ad5d40be7d6f4d4c18cd1fd36f48d1c2b318b48: Status 404 returned error can't find the container with id 632ce4b9610079a02cc3820d9ad5d40be7d6f4d4c18cd1fd36f48d1c2b318b48 Mar 10 00:09:11 crc kubenswrapper[4906]: I0310 00:09:11.055737 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxqgz\" (UniqueName: \"kubernetes.io/projected/220fec7e-5622-4a55-ab08-359de5d87c6f-kube-api-access-qxqgz\") pod \"dns-default-j4vpr\" (UID: \"220fec7e-5622-4a55-ab08-359de5d87c6f\") " pod="openshift-dns/dns-default-j4vpr" Mar 10 00:09:11 crc kubenswrapper[4906]: I0310 00:09:11.129225 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:11 crc kubenswrapper[4906]: E0310 00:09:11.129494 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:11.629463843 +0000 UTC m=+177.777358955 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:11 crc kubenswrapper[4906]: I0310 00:09:11.131182 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:11 crc kubenswrapper[4906]: E0310 00:09:11.139652 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:11.63961534 +0000 UTC m=+177.787510442 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66jxp" (UID: "47ee6fa1-0ef0-414f-91af-0f170e94c390") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:11 crc kubenswrapper[4906]: I0310 00:09:11.170549 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-zkqrr"] Mar 10 00:09:11 crc kubenswrapper[4906]: I0310 00:09:11.210007 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vw9rg" Mar 10 00:09:11 crc kubenswrapper[4906]: I0310 00:09:11.239435 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:11 crc kubenswrapper[4906]: E0310 00:09:11.239930 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:11.739913286 +0000 UTC m=+177.887808398 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:11 crc kubenswrapper[4906]: I0310 00:09:11.261719 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qgvms" Mar 10 00:09:11 crc kubenswrapper[4906]: I0310 00:09:11.282800 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-qnppg"] Mar 10 00:09:11 crc kubenswrapper[4906]: I0310 00:09:11.287942 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29551680-hclh4"] Mar 10 00:09:11 crc kubenswrapper[4906]: I0310 00:09:11.305606 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551688-fkkqj" Mar 10 00:09:11 crc kubenswrapper[4906]: I0310 00:09:11.307571 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-2cxm6"] Mar 10 00:09:11 crc kubenswrapper[4906]: I0310 00:09:11.309660 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-ltqv2"] Mar 10 00:09:11 crc kubenswrapper[4906]: I0310 00:09:11.340199 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-x6g9x" Mar 10 00:09:11 crc kubenswrapper[4906]: I0310 00:09:11.341188 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:11 crc kubenswrapper[4906]: E0310 00:09:11.341594 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:11.84158175 +0000 UTC m=+177.989476862 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66jxp" (UID: "47ee6fa1-0ef0-414f-91af-0f170e94c390") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:11 crc kubenswrapper[4906]: I0310 00:09:11.352934 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-j4vpr" Mar 10 00:09:11 crc kubenswrapper[4906]: I0310 00:09:11.406564 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-qfc95"] Mar 10 00:09:11 crc kubenswrapper[4906]: I0310 00:09:11.444198 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:11 crc kubenswrapper[4906]: E0310 00:09:11.444578 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:11.944550412 +0000 UTC m=+178.092445524 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:11 crc kubenswrapper[4906]: I0310 00:09:11.444697 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:11 crc kubenswrapper[4906]: E0310 00:09:11.445229 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:11.945222211 +0000 UTC m=+178.093117323 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66jxp" (UID: "47ee6fa1-0ef0-414f-91af-0f170e94c390") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:11 crc kubenswrapper[4906]: I0310 00:09:11.450773 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nkxsv"] Mar 10 00:09:11 crc kubenswrapper[4906]: I0310 00:09:11.452612 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2g9zb"] Mar 10 00:09:11 crc kubenswrapper[4906]: W0310 00:09:11.470480 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2279b4a5_d4f2_455a_87e2_0774d6d57c40.slice/crio-2de4e5270dde8b846ff69da7effdfac19a278149b5cc750828ca18e0d729ab60 WatchSource:0}: Error finding container 2de4e5270dde8b846ff69da7effdfac19a278149b5cc750828ca18e0d729ab60: Status 404 returned error can't find the container with id 2de4e5270dde8b846ff69da7effdfac19a278149b5cc750828ca18e0d729ab60 Mar 10 00:09:11 crc kubenswrapper[4906]: I0310 00:09:11.533086 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-qgdqx"] Mar 10 00:09:11 crc kubenswrapper[4906]: I0310 00:09:11.546117 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:11 crc kubenswrapper[4906]: E0310 00:09:11.546829 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:12.046809033 +0000 UTC m=+178.194704145 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:11 crc kubenswrapper[4906]: I0310 00:09:11.629527 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b492s" podStartSLOduration=123.629500901 podStartE2EDuration="2m3.629500901s" podCreationTimestamp="2026-03-10 00:07:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:11.593992547 +0000 UTC m=+177.741887659" watchObservedRunningTime="2026-03-10 00:09:11.629500901 +0000 UTC m=+177.777396013" Mar 10 00:09:11 crc kubenswrapper[4906]: I0310 00:09:11.649716 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:11 crc kubenswrapper[4906]: E0310 
00:09:11.650047 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:12.150035332 +0000 UTC m=+178.297930444 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66jxp" (UID: "47ee6fa1-0ef0-414f-91af-0f170e94c390") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:11 crc kubenswrapper[4906]: W0310 00:09:11.672732 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff15a729_bbdf_4422_8db2_0e429e76ee25.slice/crio-c6de9de034f621db84d6f070fbc7686ae7b80b2f5f8af5a7656f02fcd8be3847 WatchSource:0}: Error finding container c6de9de034f621db84d6f070fbc7686ae7b80b2f5f8af5a7656f02fcd8be3847: Status 404 returned error can't find the container with id c6de9de034f621db84d6f070fbc7686ae7b80b2f5f8af5a7656f02fcd8be3847 Mar 10 00:09:11 crc kubenswrapper[4906]: I0310 00:09:11.750421 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:11 crc kubenswrapper[4906]: E0310 00:09:11.751423 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-10 00:09:12.251403008 +0000 UTC m=+178.399298110 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:11 crc kubenswrapper[4906]: I0310 00:09:11.853121 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-lrj5w" podStartSLOduration=122.853097803 podStartE2EDuration="2m2.853097803s" podCreationTimestamp="2026-03-10 00:07:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:11.832170662 +0000 UTC m=+177.980065774" watchObservedRunningTime="2026-03-10 00:09:11.853097803 +0000 UTC m=+178.000992915" Mar 10 00:09:11 crc kubenswrapper[4906]: I0310 00:09:11.853763 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-kfw9l"] Mar 10 00:09:11 crc kubenswrapper[4906]: I0310 00:09:11.858337 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:11 crc kubenswrapper[4906]: E0310 00:09:11.858883 4906 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:12.358862606 +0000 UTC m=+178.506757718 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66jxp" (UID: "47ee6fa1-0ef0-414f-91af-0f170e94c390") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:11 crc kubenswrapper[4906]: I0310 00:09:11.882148 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wcpw2"] Mar 10 00:09:11 crc kubenswrapper[4906]: I0310 00:09:11.886744 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rlnd7"] Mar 10 00:09:11 crc kubenswrapper[4906]: I0310 00:09:11.915071 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-x7t4l" podStartSLOduration=123.91503470399999 podStartE2EDuration="2m3.915034704s" podCreationTimestamp="2026-03-10 00:07:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:11.908273203 +0000 UTC m=+178.056168325" watchObservedRunningTime="2026-03-10 00:09:11.915034704 +0000 UTC m=+178.062929826" Mar 10 00:09:11 crc kubenswrapper[4906]: W0310 00:09:11.949052 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd366e911_15d3_4899_b843_fd66f0a9416b.slice/crio-3dacbec73f6fe808a95b26ea2f418e76e3718ce1a0c419a5edb771e62f1b4968 
WatchSource:0}: Error finding container 3dacbec73f6fe808a95b26ea2f418e76e3718ce1a0c419a5edb771e62f1b4968: Status 404 returned error can't find the container with id 3dacbec73f6fe808a95b26ea2f418e76e3718ce1a0c419a5edb771e62f1b4968 Mar 10 00:09:11 crc kubenswrapper[4906]: I0310 00:09:11.959237 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:11 crc kubenswrapper[4906]: E0310 00:09:11.959970 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:12.459937724 +0000 UTC m=+178.607832826 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:11 crc kubenswrapper[4906]: I0310 00:09:11.960097 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:11 crc kubenswrapper[4906]: E0310 00:09:11.960559 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:12.460551931 +0000 UTC m=+178.608447043 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66jxp" (UID: "47ee6fa1-0ef0-414f-91af-0f170e94c390") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.009124 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-krjqb"] Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.041650 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-mq564" podStartSLOduration=124.041611953 podStartE2EDuration="2m4.041611953s" podCreationTimestamp="2026-03-10 00:07:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:12.040905323 +0000 UTC m=+178.188800435" watchObservedRunningTime="2026-03-10 00:09:12.041611953 +0000 UTC m=+178.189507065" Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.057040 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-gtbds" event={"ID":"68de0039-27ee-4f02-987f-9105c69c355f","Type":"ContainerStarted","Data":"5fd0db5d0584cac4de357285abcc81ec12538079f8b7897529bd30fcb64e412b"} Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.062491 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:12 crc 
kubenswrapper[4906]: E0310 00:09:12.063464 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:12.56342178 +0000 UTC m=+178.711316882 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.069133 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nkxsv" event={"ID":"5e154434-b989-4968-be1d-7c9982d34241","Type":"ContainerStarted","Data":"8fb1bd63a8d55cae820f7a92a6880fc0217b533082860b97dbb1f48e3eed3a39"} Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.078373 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ldnm9" event={"ID":"ad986187-a8c3-4e1d-8bff-3f385af901e4","Type":"ContainerStarted","Data":"27b129d50c460173ce55989ef5aa4ba3a21c776745a8b124014152c111f3f5dd"} Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.078427 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ldnm9" event={"ID":"ad986187-a8c3-4e1d-8bff-3f385af901e4","Type":"ContainerStarted","Data":"24a6cd28c16dbe559b0fa01959c9b6e45045be8131a33ef83ebcd1eb7d1b460f"} Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.079292 4906 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ldnm9" Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.080672 4906 generic.go:334] "Generic (PLEG): container finished" podID="7f9945a5-74c6-482d-b4c4-1097e3efebe0" containerID="caa8994e898f6e360a7f1cadde090b61bbf7e84d505983bc9d6dca14c3689818" exitCode=0 Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.080728 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vt2mt" event={"ID":"7f9945a5-74c6-482d-b4c4-1097e3efebe0","Type":"ContainerDied","Data":"caa8994e898f6e360a7f1cadde090b61bbf7e84d505983bc9d6dca14c3689818"} Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.087251 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rlnd7" event={"ID":"332da74f-063b-48aa-8b86-8646fddcadda","Type":"ContainerStarted","Data":"f7f4a66601a364c05fba11da9fcd600735775b6d698f6f9beb128346d6b083f2"} Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.089751 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-qnppg" event={"ID":"cff04257-cb6e-44cf-93de-f4e2cdf6698d","Type":"ContainerStarted","Data":"0b3cfcaeb416f6cd7cd58665995b4136e3a20a0e12052eca8b2813ada6a952eb"} Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.091820 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gcx4m" event={"ID":"e5131de2-27e3-4ec2-8a59-7cc11f6865be","Type":"ContainerStarted","Data":"32c9808ec924f6cdb03ad63a93e8ad6984c0254d5e85b2f528d0ca064199e60d"} Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.095031 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2g9zb" 
event={"ID":"ff15a729-bbdf-4422-8db2-0e429e76ee25","Type":"ContainerStarted","Data":"c6de9de034f621db84d6f070fbc7686ae7b80b2f5f8af5a7656f02fcd8be3847"} Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.099186 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qgdqx" event={"ID":"6b23a6fe-6dcc-4d84-8591-31079d563929","Type":"ContainerStarted","Data":"dee1efff17933b90a8c5e6d2a63f17526c1908206241b43c3d6095a8cf0a58a3"} Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.110018 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2cxm6" event={"ID":"dc8ebb5d-19af-4da1-8767-4b901d1fef8a","Type":"ContainerStarted","Data":"77d5c345fa94b84e736485541c0e9514ca5531db88fb9c9a586bcb360990c4e7"} Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.164533 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:12 crc kubenswrapper[4906]: E0310 00:09:12.165012 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:12.664997882 +0000 UTC m=+178.812892994 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66jxp" (UID: "47ee6fa1-0ef0-414f-91af-0f170e94c390") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.175565 4906 generic.go:334] "Generic (PLEG): container finished" podID="7ef57cce-ba04-4605-a245-edfad55f6f69" containerID="2b37c212025a2317caa9a0df88cc1e0d6ded172ffd5751f675b2390ea9978cbf" exitCode=0 Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.175867 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jws8x" event={"ID":"7ef57cce-ba04-4605-a245-edfad55f6f69","Type":"ContainerDied","Data":"2b37c212025a2317caa9a0df88cc1e0d6ded172ffd5751f675b2390ea9978cbf"} Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.184580 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-427rj" event={"ID":"de2ea5bf-12b8-4ab0-a073-53df41c9646a","Type":"ContainerStarted","Data":"99ef9e0350a7492caaa77b25a704192b546587a9d0539b1212d918d50645fa8f"} Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.189565 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551680-7lx2x"] Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.190426 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-b4rmk"] Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.197445 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wcpw2" 
event={"ID":"06401874-a064-42cf-b1db-0e5bff007b1c","Type":"ContainerStarted","Data":"16ab65a1d7cbe5544fafb2f3a55615f5aac758565b33c1085216b905b51864d4"} Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.199892 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-mnn4p"] Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.201362 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-fvmnw"] Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.216593 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-q9zx6" event={"ID":"39ebc592-086a-43e6-87d6-c2d67607d511","Type":"ContainerStarted","Data":"632ce4b9610079a02cc3820d9ad5d40be7d6f4d4c18cd1fd36f48d1c2b318b48"} Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.266300 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:12 crc kubenswrapper[4906]: E0310 00:09:12.266940 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:12.766908823 +0000 UTC m=+178.914803925 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.267091 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:12 crc kubenswrapper[4906]: E0310 00:09:12.267752 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:12.767728357 +0000 UTC m=+178.915623469 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66jxp" (UID: "47ee6fa1-0ef0-414f-91af-0f170e94c390") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.272946 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-v25gg" event={"ID":"85226c05-ab07-4243-89af-c58b7c3d1f43","Type":"ContainerStarted","Data":"a18807da7208beee65e915c4c8e80b88ace1d81d39610d4f7625ca5af530b46f"} Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.273962 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-v25gg" Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.277062 4906 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-v25gg container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.277111 4906 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-v25gg" podUID="85226c05-ab07-4243-89af-c58b7c3d1f43" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.282909 4906 generic.go:334] "Generic (PLEG): container finished" podID="f1401406-9964-4c43-8192-efd2f32732e5" 
containerID="82e0b7ff6c80238b343bdb836d840322bd79fe040398c89f49d30d7959a9962d" exitCode=0 Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.283256 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-t5bxf" event={"ID":"f1401406-9964-4c43-8192-efd2f32732e5","Type":"ContainerDied","Data":"82e0b7ff6c80238b343bdb836d840322bd79fe040398c89f49d30d7959a9962d"} Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.289200 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-qfc95" event={"ID":"9e40ce76-0ae5-42cc-9670-a40bb4b4e2e4","Type":"ContainerStarted","Data":"939270c249d87a21da7e2a85e2a114bc3f3ee55c5faad11bd799f35e120ef0e9"} Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.293114 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-ltqv2" event={"ID":"2279b4a5-d4f2-455a-87e2-0774d6d57c40","Type":"ContainerStarted","Data":"2de4e5270dde8b846ff69da7effdfac19a278149b5cc750828ca18e0d729ab60"} Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.294617 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-xzvmk"] Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.295605 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-zkqrr" event={"ID":"1c418134-7dc2-42f5-b1ce-c22a2b77a4b9","Type":"ContainerStarted","Data":"93a654dbdaa2fa1ea02d45c8cd5c69e3eff76aef1fc003aaebb6e855e1a7ba76"} Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.303595 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xd62p" event={"ID":"66a5f13c-92bf-4e0b-ac5e-f14de7c6f72a","Type":"ContainerStarted","Data":"3663b5382955d8163e6c81ae63637322138dc8f8fad57473a344288fa9abf1e5"} Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.303650 
4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xd62p" event={"ID":"66a5f13c-92bf-4e0b-ac5e-f14de7c6f72a","Type":"ContainerStarted","Data":"8018b2e3b22bf5098ff1ccf97499690f50288332093f3a75c1d431fa85ce8f14"} Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.305128 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b492s" event={"ID":"b062963a-a86b-43bb-ba19-c49fa8ac7cf1","Type":"ContainerStarted","Data":"01fc4901ef9dae16df583a02465347fba43f4c26a8a9d4585d0b3c8927425497"} Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.306925 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-kpmwl" event={"ID":"6bbf8e83-d583-49de-a12c-6f0a3953dc67","Type":"ContainerStarted","Data":"f28620369a2cb8e2bf6459c2b28099d0d5bac6090ec4608aa2eddafbff84eed9"} Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.314373 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-9rn4f" event={"ID":"1412799e-cd89-42f5-895a-76031f913a55","Type":"ContainerStarted","Data":"d4ab714830a2a335ee5543771dd13eb141f9f20d8f895bdc346f6ca878f0ba33"} Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.340862 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kzklf" event={"ID":"39b95b69-1fe1-4e7c-9499-ca59220ca7eb","Type":"ContainerStarted","Data":"6dffaaa3414f2d7a143aa41e5ec33968e10fa2dfe5f0b55ad34ade1785c628f7"} Mar 10 00:09:12 crc kubenswrapper[4906]: W0310 00:09:12.342022 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0492b9b7_f88e_48ee_88e6_83aa55d8ce65.slice/crio-482af40c76464fdb90b58eb971231d8b8032366a2a8f9f9fdc35f6db012db761 WatchSource:0}: Error finding 
container 482af40c76464fdb90b58eb971231d8b8032366a2a8f9f9fdc35f6db012db761: Status 404 returned error can't find the container with id 482af40c76464fdb90b58eb971231d8b8032366a2a8f9f9fdc35f6db012db761 Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.351190 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29551680-hclh4" event={"ID":"d91d282e-a4f3-4bc9-9623-4640774f641a","Type":"ContainerStarted","Data":"8c633a50db1c44e8b86d17ea03a08cf6ee7642ae5e1d973cfc5f07ba0c58702b"} Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.354273 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-87hqt" event={"ID":"ebf8d4f5-6995-45dd-be49-491ced904443","Type":"ContainerStarted","Data":"513f92dd87b50099db2ffba140f2dc61b7bc0eb88993d0065419e3038be1bf08"} Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.355805 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-87hqt" Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.369210 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:12 crc kubenswrapper[4906]: E0310 00:09:12.370269 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:12.870235075 +0000 UTC m=+179.018130187 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.376107 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zxgxz" event={"ID":"376340e6-76cd-476b-b86b-350cb9edfecf","Type":"ContainerStarted","Data":"80ec3ec598ba7577f73fdacb360e7f0fdf3f8b6593f15c5ac7ed313c1a5e8ce4"} Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.379377 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sjjvc" event={"ID":"6eeaccaa-9989-4a5c-91ed-67dfb7319e14","Type":"ContainerStarted","Data":"ee9b582abd1fee0d1c9e6d98209f9732b0e24d87046ff705b7ca5bcd7870e9c7"} Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.385473 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kfw9l" event={"ID":"d366e911-15d3-4899-b843-fd66f0a9416b","Type":"ContainerStarted","Data":"3dacbec73f6fe808a95b26ea2f418e76e3718ce1a0c419a5edb771e62f1b4968"} Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.395982 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551688-fkkqj"] Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.401268 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-mq564" Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.445404 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/marketplace-operator-79b997595-vw9rg"] Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.479946 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:12 crc kubenswrapper[4906]: E0310 00:09:12.484704 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:12.984689411 +0000 UTC m=+179.132584523 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66jxp" (UID: "47ee6fa1-0ef0-414f-91af-0f170e94c390") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.537778 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qgvms"] Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.538221 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-x6g9x"] Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.572387 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-kpmwl" Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.577468 4906 
patch_prober.go:28] interesting pod/router-default-5444994796-kpmwl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 00:09:12 crc kubenswrapper[4906]: [-]has-synced failed: reason withheld Mar 10 00:09:12 crc kubenswrapper[4906]: [+]process-running ok Mar 10 00:09:12 crc kubenswrapper[4906]: healthz check failed Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.577538 4906 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kpmwl" podUID="6bbf8e83-d583-49de-a12c-6f0a3953dc67" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.583127 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:12 crc kubenswrapper[4906]: E0310 00:09:12.588259 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:13.088205048 +0000 UTC m=+179.236100160 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.615355 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-x7t4l" Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.619039 4906 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 00:09:12 crc kubenswrapper[4906]: W0310 00:09:12.643034 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21590dce_fca3_4e23_8fd7_ade8be24206c.slice/crio-ae6e3a6828f5c724dd6a45e601d914c8516e282e0eccd35d2bc3aaa38eedf713 WatchSource:0}: Error finding container ae6e3a6828f5c724dd6a45e601d914c8516e282e0eccd35d2bc3aaa38eedf713: Status 404 returned error can't find the container with id ae6e3a6828f5c724dd6a45e601d914c8516e282e0eccd35d2bc3aaa38eedf713 Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.685594 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:12 crc kubenswrapper[4906]: E0310 00:09:12.686258 4906 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:13.18623914 +0000 UTC m=+179.334134252 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66jxp" (UID: "47ee6fa1-0ef0-414f-91af-0f170e94c390") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.727971 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-v25gg" podStartSLOduration=124.727951059 podStartE2EDuration="2m4.727951059s" podCreationTimestamp="2026-03-10 00:07:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:12.712899754 +0000 UTC m=+178.860794856" watchObservedRunningTime="2026-03-10 00:09:12.727951059 +0000 UTC m=+178.875846171" Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.729446 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-j4vpr"] Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.766566 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-87hqt" Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.817677 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:12 crc kubenswrapper[4906]: E0310 00:09:12.818241 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:13.318222811 +0000 UTC m=+179.466117923 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:12 crc kubenswrapper[4906]: I0310 00:09:12.921851 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:12 crc kubenswrapper[4906]: E0310 00:09:12.922253 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:13.422238852 +0000 UTC m=+179.570133964 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66jxp" (UID: "47ee6fa1-0ef0-414f-91af-0f170e94c390") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.002278 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zxgxz" podStartSLOduration=125.002251935 podStartE2EDuration="2m5.002251935s" podCreationTimestamp="2026-03-10 00:07:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:12.995062962 +0000 UTC m=+179.142958074" watchObservedRunningTime="2026-03-10 00:09:13.002251935 +0000 UTC m=+179.150147047" Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.004935 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kzklf" podStartSLOduration=125.00492711 podStartE2EDuration="2m5.00492711s" podCreationTimestamp="2026-03-10 00:07:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:12.954386881 +0000 UTC m=+179.102281993" watchObservedRunningTime="2026-03-10 00:09:13.00492711 +0000 UTC m=+179.152822222" Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.025662 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:13 crc kubenswrapper[4906]: E0310 00:09:13.026103 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:13.526083069 +0000 UTC m=+179.673978181 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.083194 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-kpmwl" podStartSLOduration=124.083168393 podStartE2EDuration="2m4.083168393s" podCreationTimestamp="2026-03-10 00:07:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:13.056129168 +0000 UTC m=+179.204024280" watchObservedRunningTime="2026-03-10 00:09:13.083168393 +0000 UTC m=+179.231063505" Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.127883 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:13 crc 
kubenswrapper[4906]: E0310 00:09:13.128306 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:13.628292949 +0000 UTC m=+179.776188061 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66jxp" (UID: "47ee6fa1-0ef0-414f-91af-0f170e94c390") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.146516 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ldnm9" podStartSLOduration=124.146488123 podStartE2EDuration="2m4.146488123s" podCreationTimestamp="2026-03-10 00:07:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:13.130094999 +0000 UTC m=+179.277990111" watchObservedRunningTime="2026-03-10 00:09:13.146488123 +0000 UTC m=+179.294383235" Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.193759 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-87hqt" podStartSLOduration=124.193739859 podStartE2EDuration="2m4.193739859s" podCreationTimestamp="2026-03-10 00:07:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:13.192057451 +0000 UTC m=+179.339952573" watchObservedRunningTime="2026-03-10 00:09:13.193739859 +0000 UTC 
m=+179.341634971" Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.232327 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:13 crc kubenswrapper[4906]: E0310 00:09:13.232755 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:13.732734712 +0000 UTC m=+179.880629824 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.262480 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xd62p" podStartSLOduration=124.262454352 podStartE2EDuration="2m4.262454352s" podCreationTimestamp="2026-03-10 00:07:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:13.254993721 +0000 UTC m=+179.402888823" watchObservedRunningTime="2026-03-10 00:09:13.262454352 +0000 UTC m=+179.410349464" Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.334191 4906 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:13 crc kubenswrapper[4906]: E0310 00:09:13.334775 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:13.834753186 +0000 UTC m=+179.982648298 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66jxp" (UID: "47ee6fa1-0ef0-414f-91af-0f170e94c390") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.435608 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:13 crc kubenswrapper[4906]: E0310 00:09:13.436111 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:13.936076131 +0000 UTC m=+180.083971233 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.441749 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-xzvmk" event={"ID":"cd304b2a-9e83-4f93-9f48-c88f6ab93b16","Type":"ContainerStarted","Data":"e981ea591f54521dc3be37953e1feca82ea28b78616a757535b82f6231be38e2"} Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.445405 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-ltqv2" event={"ID":"2279b4a5-d4f2-455a-87e2-0774d6d57c40","Type":"ContainerStarted","Data":"e86157127fbaef46e13643fb3bf09b6ba68401287206ab27a7e4da86558c1b14"} Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.459150 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sjjvc" event={"ID":"6eeaccaa-9989-4a5c-91ed-67dfb7319e14","Type":"ContainerStarted","Data":"f1a54d8f94586d9d333a1b3603231bd4e3a26f533fd6362701a3173731c39aaa"} Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.474562 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nkxsv" event={"ID":"5e154434-b989-4968-be1d-7c9982d34241","Type":"ContainerStarted","Data":"9405fac25a07a05adc59719050d90b727bad58c192d957f4df0c30ad61c8634b"} Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.484974 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-f9d7485db-q9zx6" event={"ID":"39ebc592-086a-43e6-87d6-c2d67607d511","Type":"ContainerStarted","Data":"59a4fe4bc39862981655f7f270351007504c3057f9d18f3d59907cbf2f44c4ed"} Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.485435 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-ltqv2" podStartSLOduration=125.485417916 podStartE2EDuration="2m5.485417916s" podCreationTimestamp="2026-03-10 00:07:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:13.483314907 +0000 UTC m=+179.631210019" watchObservedRunningTime="2026-03-10 00:09:13.485417916 +0000 UTC m=+179.633313028" Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.511137 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-9rn4f" event={"ID":"1412799e-cd89-42f5-895a-76031f913a55","Type":"ContainerStarted","Data":"3bbdb27a9d692d6b9aa1179b84890adaa4d745df5481914a2d941c6eae28d175"} Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.526362 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nkxsv" podStartSLOduration=124.526315952 podStartE2EDuration="2m4.526315952s" podCreationTimestamp="2026-03-10 00:07:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:13.524934473 +0000 UTC m=+179.672829595" watchObservedRunningTime="2026-03-10 00:09:13.526315952 +0000 UTC m=+179.674211074" Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.527882 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rlnd7" 
event={"ID":"332da74f-063b-48aa-8b86-8646fddcadda","Type":"ContainerStarted","Data":"a5ee744aaa348782aeefd365bc6137a518d8773ee1d530287d39881a90050a6f"} Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.529600 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rlnd7" Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.535771 4906 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-rlnd7 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.43:5443/healthz\": dial tcp 10.217.0.43:5443: connect: connection refused" start-of-body= Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.535828 4906 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rlnd7" podUID="332da74f-063b-48aa-8b86-8646fddcadda" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.43:5443/healthz\": dial tcp 10.217.0.43:5443: connect: connection refused" Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.539173 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:13 crc kubenswrapper[4906]: E0310 00:09:13.546448 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:14.046427171 +0000 UTC m=+180.194322283 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66jxp" (UID: "47ee6fa1-0ef0-414f-91af-0f170e94c390") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.558620 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-gtbds" event={"ID":"68de0039-27ee-4f02-987f-9105c69c355f","Type":"ContainerStarted","Data":"3722051313eeccf0febb92584e60c0d58c761eb05f503f6caf5ed352a1dadb6d"} Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.589118 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gcx4m" event={"ID":"e5131de2-27e3-4ec2-8a59-7cc11f6865be","Type":"ContainerStarted","Data":"1160eead29808ce2fdb22dc60467ebe119dc2b4b781ef64a2752e9f53455f707"} Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.590495 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gcx4m" Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.600738 4906 patch_prober.go:28] interesting pod/router-default-5444994796-kpmwl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 00:09:13 crc kubenswrapper[4906]: [-]has-synced failed: reason withheld Mar 10 00:09:13 crc kubenswrapper[4906]: [+]process-running ok Mar 10 00:09:13 crc kubenswrapper[4906]: healthz check failed Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.600790 4906 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-kpmwl" podUID="6bbf8e83-d583-49de-a12c-6f0a3953dc67" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.603759 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-q9zx6" podStartSLOduration=125.603729391 podStartE2EDuration="2m5.603729391s" podCreationTimestamp="2026-03-10 00:07:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:13.570590364 +0000 UTC m=+179.718485476" watchObservedRunningTime="2026-03-10 00:09:13.603729391 +0000 UTC m=+179.751624503" Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.604160 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-9rn4f" podStartSLOduration=6.604154083 podStartE2EDuration="6.604154083s" podCreationTimestamp="2026-03-10 00:09:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:13.595090707 +0000 UTC m=+179.742985819" watchObservedRunningTime="2026-03-10 00:09:13.604154083 +0000 UTC m=+179.752049195" Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.625303 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-zkqrr" event={"ID":"1c418134-7dc2-42f5-b1ce-c22a2b77a4b9","Type":"ContainerStarted","Data":"4983cb8bc2e7a88dba076b5e73da320b07a7db8f50141e835102d85f50d8c7d1"} Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.626431 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-zkqrr" Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.631601 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gcx4m" Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.633173 4906 patch_prober.go:28] interesting pod/downloads-7954f5f757-zkqrr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.633428 4906 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zkqrr" podUID="1c418134-7dc2-42f5-b1ce-c22a2b77a4b9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.633488 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-fvmnw" event={"ID":"851d4efd-45ba-4f03-bfa6-75bde8aae278","Type":"ContainerStarted","Data":"a5a202c09bdddc9ed5bb4ab5ad93a1b3b7cbbd0ce2a1bec496a077addfd5b8f3"} Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.640071 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:13 crc kubenswrapper[4906]: E0310 00:09:13.643658 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:14.143602288 +0000 UTC m=+180.291497400 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.665208 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gcx4m" podStartSLOduration=124.665185578 podStartE2EDuration="2m4.665185578s" podCreationTimestamp="2026-03-10 00:07:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:13.626023452 +0000 UTC m=+179.773918564" watchObservedRunningTime="2026-03-10 00:09:13.665185578 +0000 UTC m=+179.813080680" Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.670969 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zxgxz" event={"ID":"376340e6-76cd-476b-b86b-350cb9edfecf","Type":"ContainerStarted","Data":"e09c5bd978e8776b953a176b452c78e9d6fad4c585a064001005d8529dc54d56"} Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.672659 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vw9rg" event={"ID":"a46863f6-02e5-4d35-8b59-216377e41403","Type":"ContainerStarted","Data":"2a1ca1e77aa51ff73f567a9ff25f601106a3f2e3f7ca949b5e359c1ad6a71ba6"} Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.688020 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rlnd7" podStartSLOduration=124.687998563 
podStartE2EDuration="2m4.687998563s" podCreationTimestamp="2026-03-10 00:07:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:13.665031603 +0000 UTC m=+179.812926715" watchObservedRunningTime="2026-03-10 00:09:13.687998563 +0000 UTC m=+179.835893665" Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.693513 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29551680-hclh4" event={"ID":"d91d282e-a4f3-4bc9-9623-4640774f641a","Type":"ContainerStarted","Data":"7e7178444a2023cfc4df700ef8b68a751212644659aac35229bd49430c1b1c77"} Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.697211 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-427rj" event={"ID":"de2ea5bf-12b8-4ab0-a073-53df41c9646a","Type":"ContainerStarted","Data":"7725b0664c493c5bfb655f7092df904d09575719dceb9f1a206832df06c829a4"} Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.706160 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-j4vpr" event={"ID":"220fec7e-5622-4a55-ab08-359de5d87c6f","Type":"ContainerStarted","Data":"bffb3cc5beea3c462bf84248d0a990dced2e741c69531bac939adfdb58e5725c"} Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.729278 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-qnppg" event={"ID":"cff04257-cb6e-44cf-93de-f4e2cdf6698d","Type":"ContainerStarted","Data":"bc3841c38fec4a0af54ba7df15e161ab4a8d4f94420122af57b0226a7ed6a695"} Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.742417 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-zkqrr" podStartSLOduration=125.742395451 podStartE2EDuration="2m5.742395451s" podCreationTimestamp="2026-03-10 00:07:08 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:13.742325439 +0000 UTC m=+179.890220551" watchObservedRunningTime="2026-03-10 00:09:13.742395451 +0000 UTC m=+179.890290563" Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.743659 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:13 crc kubenswrapper[4906]: E0310 00:09:13.746432 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:14.246414055 +0000 UTC m=+180.394309157 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66jxp" (UID: "47ee6fa1-0ef0-414f-91af-0f170e94c390") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.769895 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2g9zb" event={"ID":"ff15a729-bbdf-4422-8db2-0e429e76ee25","Type":"ContainerStarted","Data":"7731187f6a2cec4deb01b467658d88569629c5b669976f9e7001501cc95b4609"} Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.782612 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551688-fkkqj" event={"ID":"094c6270-b610-42c0-a6ce-3c146cb6bb6c","Type":"ContainerStarted","Data":"702b2e5d0f0b233b4c7fb5bb04fc8b9918a7e64d152cad9375aa67adaa5802a1"} Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.823497 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-pruner-29551680-hclh4" podStartSLOduration=125.819043488 podStartE2EDuration="2m5.819043488s" podCreationTimestamp="2026-03-10 00:07:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:13.786136468 +0000 UTC m=+179.934031580" watchObservedRunningTime="2026-03-10 00:09:13.819043488 +0000 UTC m=+179.966938600" Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.825805 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-427rj" podStartSLOduration=124.825792239 
podStartE2EDuration="2m4.825792239s" podCreationTimestamp="2026-03-10 00:07:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:13.816522967 +0000 UTC m=+179.964418079" watchObservedRunningTime="2026-03-10 00:09:13.825792239 +0000 UTC m=+179.973687351" Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.844448 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:13 crc kubenswrapper[4906]: E0310 00:09:13.845917 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:14.345896347 +0000 UTC m=+180.493791459 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.846542 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vt2mt" event={"ID":"7f9945a5-74c6-482d-b4c4-1097e3efebe0","Type":"ContainerStarted","Data":"dd6236c392b4dfac5393511fa74a32bdc1e97f93aaa4af05410abd48e30e5ad8"} Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.846623 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vt2mt" Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.854732 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551680-7lx2x" event={"ID":"0492b9b7-f88e-48ee-88e6-83aa55d8ce65","Type":"ContainerStarted","Data":"482af40c76464fdb90b58eb971231d8b8032366a2a8f9f9fdc35f6db012db761"} Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.894960 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-b4rmk" event={"ID":"380fb2b7-9fcf-4c1a-879a-a4a6195f4e58","Type":"ContainerStarted","Data":"8c63d357b769611a80df5aacafdb563800581de8e9f51f2496eb866b3db70e59"} Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.901991 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-2g9zb" podStartSLOduration=124.901965133 podStartE2EDuration="2m4.901965133s" podCreationTimestamp="2026-03-10 00:07:09 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:13.857490975 +0000 UTC m=+180.005386087" watchObservedRunningTime="2026-03-10 00:09:13.901965133 +0000 UTC m=+180.049860245" Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.921591 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mnn4p" event={"ID":"793885bc-abbf-44be-8e32-b2088cf449bd","Type":"ContainerStarted","Data":"80bfcbec92b9d2880c1d3a3833e4e8067b113855f90f7c0e972a89181d9b56ce"} Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.940271 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-x6g9x" event={"ID":"d49db0cb-de7e-4194-9f6e-8b58cf5f98fb","Type":"ContainerStarted","Data":"c8529fa9f7e8aa442997d164cbed47ae810d122173c0e4f2e8a9c5c8a8b8a4a9"} Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.946769 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:13 crc kubenswrapper[4906]: E0310 00:09:13.954823 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:14.454801777 +0000 UTC m=+180.602696889 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66jxp" (UID: "47ee6fa1-0ef0-414f-91af-0f170e94c390") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:13 crc kubenswrapper[4906]: I0310 00:09:13.962685 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-qfc95" podStartSLOduration=124.962657709 podStartE2EDuration="2m4.962657709s" podCreationTimestamp="2026-03-10 00:07:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:13.908122187 +0000 UTC m=+180.056017309" watchObservedRunningTime="2026-03-10 00:09:13.962657709 +0000 UTC m=+180.110552821" Mar 10 00:09:14 crc kubenswrapper[4906]: I0310 00:09:14.012512 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-krjqb" event={"ID":"d9b04397-d728-4bd8-a4ec-55bb6ebde285","Type":"ContainerStarted","Data":"fe621c4cec866a66e8ba4899d1f495181ab2015f1528756499493624c5e0842f"} Mar 10 00:09:14 crc kubenswrapper[4906]: I0310 00:09:14.026135 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qgvms" event={"ID":"21590dce-fca3-4e23-8fd7-ade8be24206c","Type":"ContainerStarted","Data":"ae6e3a6828f5c724dd6a45e601d914c8516e282e0eccd35d2bc3aaa38eedf713"} Mar 10 00:09:14 crc kubenswrapper[4906]: I0310 00:09:14.043389 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vt2mt" podStartSLOduration=126.043364741 
podStartE2EDuration="2m6.043364741s" podCreationTimestamp="2026-03-10 00:07:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:13.973741512 +0000 UTC m=+180.121636624" watchObservedRunningTime="2026-03-10 00:09:14.043364741 +0000 UTC m=+180.191259853" Mar 10 00:09:14 crc kubenswrapper[4906]: I0310 00:09:14.044248 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-krjqb" podStartSLOduration=125.044242575 podStartE2EDuration="2m5.044242575s" podCreationTimestamp="2026-03-10 00:07:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:14.038801292 +0000 UTC m=+180.186696404" watchObservedRunningTime="2026-03-10 00:09:14.044242575 +0000 UTC m=+180.192137687" Mar 10 00:09:14 crc kubenswrapper[4906]: I0310 00:09:14.051376 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:14 crc kubenswrapper[4906]: E0310 00:09:14.052120 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:14.552046846 +0000 UTC m=+180.699941958 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:14 crc kubenswrapper[4906]: I0310 00:09:14.112061 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qgdqx" event={"ID":"6b23a6fe-6dcc-4d84-8591-31079d563929","Type":"ContainerStarted","Data":"f13097e6d1f1339f4bab0e8077eb091f78348f357edcd6975c45187ff92c4657"} Mar 10 00:09:14 crc kubenswrapper[4906]: I0310 00:09:14.137071 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2cxm6" event={"ID":"dc8ebb5d-19af-4da1-8767-4b901d1fef8a","Type":"ContainerStarted","Data":"e197694c1cbdd9b37b4b57debb3164c733b456c01dd71eae8829cf7b4a6d00d9"} Mar 10 00:09:14 crc kubenswrapper[4906]: I0310 00:09:14.148793 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-v25gg" Mar 10 00:09:14 crc kubenswrapper[4906]: I0310 00:09:14.153331 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:14 crc kubenswrapper[4906]: E0310 00:09:14.154037 4906 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:14.654023479 +0000 UTC m=+180.801918591 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66jxp" (UID: "47ee6fa1-0ef0-414f-91af-0f170e94c390") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:14 crc kubenswrapper[4906]: I0310 00:09:14.213046 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2cxm6" podStartSLOduration=125.213012087 podStartE2EDuration="2m5.213012087s" podCreationTimestamp="2026-03-10 00:07:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:14.182652449 +0000 UTC m=+180.330547561" watchObservedRunningTime="2026-03-10 00:09:14.213012087 +0000 UTC m=+180.360907199" Mar 10 00:09:14 crc kubenswrapper[4906]: I0310 00:09:14.254624 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:14 crc kubenswrapper[4906]: E0310 00:09:14.256424 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-10 00:09:14.756405814 +0000 UTC m=+180.904300926 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:14 crc kubenswrapper[4906]: I0310 00:09:14.362696 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:14 crc kubenswrapper[4906]: E0310 00:09:14.364332 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:14.864315225 +0000 UTC m=+181.012210337 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66jxp" (UID: "47ee6fa1-0ef0-414f-91af-0f170e94c390") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:14 crc kubenswrapper[4906]: I0310 00:09:14.468233 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:14 crc kubenswrapper[4906]: E0310 00:09:14.468694 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:14.968677396 +0000 UTC m=+181.116572508 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:14 crc kubenswrapper[4906]: I0310 00:09:14.586522 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:14 crc kubenswrapper[4906]: E0310 00:09:14.588835 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:15.088810963 +0000 UTC m=+181.236706075 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66jxp" (UID: "47ee6fa1-0ef0-414f-91af-0f170e94c390") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:14 crc kubenswrapper[4906]: I0310 00:09:14.613885 4906 patch_prober.go:28] interesting pod/router-default-5444994796-kpmwl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 00:09:14 crc kubenswrapper[4906]: [-]has-synced failed: reason withheld Mar 10 00:09:14 crc kubenswrapper[4906]: [+]process-running ok Mar 10 00:09:14 crc kubenswrapper[4906]: healthz check failed Mar 10 00:09:14 crc kubenswrapper[4906]: I0310 00:09:14.614377 4906 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kpmwl" podUID="6bbf8e83-d583-49de-a12c-6f0a3953dc67" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 00:09:14 crc kubenswrapper[4906]: I0310 00:09:14.697853 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:14 crc kubenswrapper[4906]: E0310 00:09:14.698351 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-10 00:09:15.198329989 +0000 UTC m=+181.346225101 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:14 crc kubenswrapper[4906]: I0310 00:09:14.802223 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:14 crc kubenswrapper[4906]: E0310 00:09:14.802695 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:15.30268054 +0000 UTC m=+181.450575652 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66jxp" (UID: "47ee6fa1-0ef0-414f-91af-0f170e94c390") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:14 crc kubenswrapper[4906]: I0310 00:09:14.904297 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:14 crc kubenswrapper[4906]: E0310 00:09:14.904811 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:15.404790637 +0000 UTC m=+181.552685749 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.012477 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:15 crc kubenswrapper[4906]: E0310 00:09:15.013273 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:15.513259894 +0000 UTC m=+181.661155006 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66jxp" (UID: "47ee6fa1-0ef0-414f-91af-0f170e94c390") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.117418 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:15 crc kubenswrapper[4906]: E0310 00:09:15.117929 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:15.617894492 +0000 UTC m=+181.765789604 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.176880 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mnn4p" event={"ID":"793885bc-abbf-44be-8e32-b2088cf449bd","Type":"ContainerStarted","Data":"987af76af7c4bc7fe10b4771b174aa427205e04a0fecbf391f433623905a195f"} Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.176936 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mnn4p" event={"ID":"793885bc-abbf-44be-8e32-b2088cf449bd","Type":"ContainerStarted","Data":"59ee8f05284513e5d4d1817a0cc3cd5445e703057dd445fc1797161c583d023d"} Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.207947 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-fvmnw" event={"ID":"851d4efd-45ba-4f03-bfa6-75bde8aae278","Type":"ContainerStarted","Data":"c0dcaf968a5220dc92678f7d559dcfce778458c6d68cee1a8c92034aec65f8ed"} Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.220106 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:15 crc kubenswrapper[4906]: E0310 
00:09:15.220591 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:15.720572545 +0000 UTC m=+181.868467667 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66jxp" (UID: "47ee6fa1-0ef0-414f-91af-0f170e94c390") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.226994 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2cxm6" event={"ID":"dc8ebb5d-19af-4da1-8767-4b901d1fef8a","Type":"ContainerStarted","Data":"6a567bfbc3461f0ac040d4a382a9805b2e7cc8649ee504e4a0aaf459934f588d"} Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.280401 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-fvmnw" podStartSLOduration=126.280381656 podStartE2EDuration="2m6.280381656s" podCreationTimestamp="2026-03-10 00:07:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:15.27378202 +0000 UTC m=+181.421677122" watchObservedRunningTime="2026-03-10 00:09:15.280381656 +0000 UTC m=+181.428276768" Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.281774 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mnn4p" podStartSLOduration=126.281766796 podStartE2EDuration="2m6.281766796s" 
podCreationTimestamp="2026-03-10 00:07:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:15.225722081 +0000 UTC m=+181.373617203" watchObservedRunningTime="2026-03-10 00:09:15.281766796 +0000 UTC m=+181.429661908" Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.289020 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-krjqb" event={"ID":"d9b04397-d728-4bd8-a4ec-55bb6ebde285","Type":"ContainerStarted","Data":"db5fb8636e837403db35c9a76236df2d58af933539b2d1752e3c86b109a0642d"} Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.323446 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:15 crc kubenswrapper[4906]: E0310 00:09:15.326331 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:15.826295425 +0000 UTC m=+181.974190707 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.335509 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vw9rg" event={"ID":"a46863f6-02e5-4d35-8b59-216377e41403","Type":"ContainerStarted","Data":"e69f1b587cd2d25069aa13c3a790a609e50976e1cfe2d22eddc8253bb52babc2"} Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.336320 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-vw9rg" Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.341849 4906 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-vw9rg container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.342148 4906 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-vw9rg" podUID="a46863f6-02e5-4d35-8b59-216377e41403" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.353893 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kfw9l" 
event={"ID":"d366e911-15d3-4899-b843-fd66f0a9416b","Type":"ContainerStarted","Data":"bed0466e8a2814ea56a478cc0c5b2cc7146c7f368d08fb6fd407d1e0c6310664"} Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.353950 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kfw9l" event={"ID":"d366e911-15d3-4899-b843-fd66f0a9416b","Type":"ContainerStarted","Data":"34e3c7d076b36bbe373638305c2fba2138713ff247f80906610746b02c8271e4"} Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.355796 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-xzvmk" event={"ID":"cd304b2a-9e83-4f93-9f48-c88f6ab93b16","Type":"ContainerStarted","Data":"442a28dda6642e9f4a23b6472282feff5bd1ef58b78362112737fb9b10ec82f1"} Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.387028 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-vw9rg" podStartSLOduration=126.387012811 podStartE2EDuration="2m6.387012811s" podCreationTimestamp="2026-03-10 00:07:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:15.386690102 +0000 UTC m=+181.534585214" watchObservedRunningTime="2026-03-10 00:09:15.387012811 +0000 UTC m=+181.534907923" Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.399432 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-t65x9"] Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.400434 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-t65x9" Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.404036 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.407472 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-gtbds" event={"ID":"68de0039-27ee-4f02-987f-9105c69c355f","Type":"ContainerStarted","Data":"bcf889ebe022cf9463f90bf38c44691b38bd85e42955578da29e9c826c955551"} Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.427026 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:15 crc kubenswrapper[4906]: E0310 00:09:15.427360 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:15.927347412 +0000 UTC m=+182.075242524 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66jxp" (UID: "47ee6fa1-0ef0-414f-91af-0f170e94c390") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.447589 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-t65x9"] Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.448511 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-t5bxf" event={"ID":"f1401406-9964-4c43-8192-efd2f32732e5","Type":"ContainerStarted","Data":"2acb99a85893b22f833084f00c9787096d2cbd090c9ae1c47257bcd9e1b1c843"} Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.448538 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-t5bxf" event={"ID":"f1401406-9964-4c43-8192-efd2f32732e5","Type":"ContainerStarted","Data":"6247ea263862a5963a71668c9c7db2d6d1cbfd8d92868fb7325173969b463e5c"} Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.448571 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kfw9l" podStartSLOduration=126.448549781 podStartE2EDuration="2m6.448549781s" podCreationTimestamp="2026-03-10 00:07:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:15.425745496 +0000 UTC m=+181.573640608" watchObservedRunningTime="2026-03-10 00:09:15.448549781 +0000 UTC m=+181.596444893" Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.474787 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns/dns-default-j4vpr" event={"ID":"220fec7e-5622-4a55-ab08-359de5d87c6f","Type":"ContainerStarted","Data":"0bac77c0903a63d22cd41ba05a5ee5e608a82ae50a527b6210ef43ed76402b02"} Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.511535 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551680-7lx2x" event={"ID":"0492b9b7-f88e-48ee-88e6-83aa55d8ce65","Type":"ContainerStarted","Data":"64597f622ff2f5792f882b1527eab2d6f3437f35784c901b087a19a1798ee983"} Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.528348 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-xzvmk" podStartSLOduration=126.528319357 podStartE2EDuration="2m6.528319357s" podCreationTimestamp="2026-03-10 00:07:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:15.498615537 +0000 UTC m=+181.646510649" watchObservedRunningTime="2026-03-10 00:09:15.528319357 +0000 UTC m=+181.676214459" Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.528427 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.528629 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5331074a-1c86-455a-80e9-6f945936e218-catalog-content\") pod \"community-operators-t65x9\" (UID: \"5331074a-1c86-455a-80e9-6f945936e218\") " pod="openshift-marketplace/community-operators-t65x9" Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.528705 4906 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfk87\" (UniqueName: \"kubernetes.io/projected/5331074a-1c86-455a-80e9-6f945936e218-kube-api-access-gfk87\") pod \"community-operators-t65x9\" (UID: \"5331074a-1c86-455a-80e9-6f945936e218\") " pod="openshift-marketplace/community-operators-t65x9" Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.528784 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5331074a-1c86-455a-80e9-6f945936e218-utilities\") pod \"community-operators-t65x9\" (UID: \"5331074a-1c86-455a-80e9-6f945936e218\") " pod="openshift-marketplace/community-operators-t65x9" Mar 10 00:09:15 crc kubenswrapper[4906]: E0310 00:09:15.529882 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:16.02986342 +0000 UTC m=+182.177758532 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.551815 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-gtbds" podStartSLOduration=126.5517829 podStartE2EDuration="2m6.5517829s" podCreationTimestamp="2026-03-10 00:07:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:15.538196496 +0000 UTC m=+181.686091598" watchObservedRunningTime="2026-03-10 00:09:15.5517829 +0000 UTC m=+181.699678002" Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.589991 4906 patch_prober.go:28] interesting pod/router-default-5444994796-kpmwl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 00:09:15 crc kubenswrapper[4906]: [-]has-synced failed: reason withheld Mar 10 00:09:15 crc kubenswrapper[4906]: [+]process-running ok Mar 10 00:09:15 crc kubenswrapper[4906]: healthz check failed Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.590057 4906 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kpmwl" podUID="6bbf8e83-d583-49de-a12c-6f0a3953dc67" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.590971 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jws8x" event={"ID":"7ef57cce-ba04-4605-a245-edfad55f6f69","Type":"ContainerStarted","Data":"4eda1bd6eec1f012c914f3732c3ae6ce49ec7873dc85e4d2725e245e817615e9"} Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.632948 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5331074a-1c86-455a-80e9-6f945936e218-utilities\") pod \"community-operators-t65x9\" (UID: \"5331074a-1c86-455a-80e9-6f945936e218\") " pod="openshift-marketplace/community-operators-t65x9" Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.633038 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.633162 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5331074a-1c86-455a-80e9-6f945936e218-catalog-content\") pod \"community-operators-t65x9\" (UID: \"5331074a-1c86-455a-80e9-6f945936e218\") " pod="openshift-marketplace/community-operators-t65x9" Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.633208 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfk87\" (UniqueName: \"kubernetes.io/projected/5331074a-1c86-455a-80e9-6f945936e218-kube-api-access-gfk87\") pod \"community-operators-t65x9\" (UID: \"5331074a-1c86-455a-80e9-6f945936e218\") " pod="openshift-marketplace/community-operators-t65x9" Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.635368 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5331074a-1c86-455a-80e9-6f945936e218-utilities\") pod \"community-operators-t65x9\" (UID: \"5331074a-1c86-455a-80e9-6f945936e218\") " pod="openshift-marketplace/community-operators-t65x9" Mar 10 00:09:15 crc kubenswrapper[4906]: E0310 00:09:15.640428 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:16.140407936 +0000 UTC m=+182.288303048 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66jxp" (UID: "47ee6fa1-0ef0-414f-91af-0f170e94c390") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.656892 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5331074a-1c86-455a-80e9-6f945936e218-catalog-content\") pod \"community-operators-t65x9\" (UID: \"5331074a-1c86-455a-80e9-6f945936e218\") " pod="openshift-marketplace/community-operators-t65x9" Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.659132 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qgvms" event={"ID":"21590dce-fca3-4e23-8fd7-ade8be24206c","Type":"ContainerStarted","Data":"2e923606bb590fc1834f03f6a0f7b1630205a47f9581245f6be26e57838a6831"} Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.673291 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nvvhx"] Mar 10 00:09:15 crc 
kubenswrapper[4906]: I0310 00:09:15.674550 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nvvhx" Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.698010 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfk87\" (UniqueName: \"kubernetes.io/projected/5331074a-1c86-455a-80e9-6f945936e218-kube-api-access-gfk87\") pod \"community-operators-t65x9\" (UID: \"5331074a-1c86-455a-80e9-6f945936e218\") " pod="openshift-marketplace/community-operators-t65x9" Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.710044 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nvvhx"] Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.718329 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-b4rmk" event={"ID":"380fb2b7-9fcf-4c1a-879a-a4a6195f4e58","Type":"ContainerStarted","Data":"15d65bfebfcf9151756b063dfebab2244374d25c9fbd4ffd88b32aa71aa3c2cd"} Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.718677 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.728453 4906 ???:1] "http: TLS handshake error from 192.168.126.11:39600: no serving certificate available for the kubelet" Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.735159 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:15 crc kubenswrapper[4906]: E0310 00:09:15.736593 4906 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:16.236572835 +0000 UTC m=+182.384467937 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.746050 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t65x9" Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.778756 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wcpw2" event={"ID":"06401874-a064-42cf-b1db-0e5bff007b1c","Type":"ContainerStarted","Data":"b8df2468bce30df00bd2820acf62a254d75706efe7e37a185e414d745719784a"} Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.780018 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wcpw2" Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.820877 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fhmhd"] Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.821871 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qgdqx" event={"ID":"6b23a6fe-6dcc-4d84-8591-31079d563929","Type":"ContainerStarted","Data":"00864db50af9aae03d8f4eb7ec3f54fa2d642a7a5d9650cbea068a024c654864"} Mar 
10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.821957 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fhmhd" Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.842557 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.842648 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc4d2e8f-54ca-464b-b186-432747b22864-catalog-content\") pod \"certified-operators-nvvhx\" (UID: \"dc4d2e8f-54ca-464b-b186-432747b22864\") " pod="openshift-marketplace/certified-operators-nvvhx" Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.842703 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4f97\" (UniqueName: \"kubernetes.io/projected/dc4d2e8f-54ca-464b-b186-432747b22864-kube-api-access-j4f97\") pod \"certified-operators-nvvhx\" (UID: \"dc4d2e8f-54ca-464b-b186-432747b22864\") " pod="openshift-marketplace/certified-operators-nvvhx" Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.842745 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc4d2e8f-54ca-464b-b186-432747b22864-utilities\") pod \"certified-operators-nvvhx\" (UID: \"dc4d2e8f-54ca-464b-b186-432747b22864\") " pod="openshift-marketplace/certified-operators-nvvhx" Mar 10 00:09:15 crc kubenswrapper[4906]: E0310 00:09:15.843145 4906 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:16.343130948 +0000 UTC m=+182.491026060 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66jxp" (UID: "47ee6fa1-0ef0-414f-91af-0f170e94c390") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.852544 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-t5bxf" podStartSLOduration=127.852514513 podStartE2EDuration="2m7.852514513s" podCreationTimestamp="2026-03-10 00:07:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:15.843171949 +0000 UTC m=+181.991067061" watchObservedRunningTime="2026-03-10 00:09:15.852514513 +0000 UTC m=+182.000409625" Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.854870 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-qnppg" event={"ID":"cff04257-cb6e-44cf-93de-f4e2cdf6698d","Type":"ContainerStarted","Data":"a4cdf03412834570fb42494e3992a7914c37ea5301c6f03c11162b8c65c202a3"} Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.866828 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wcpw2" Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.878041 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd-operator/etcd-operator-b45778765-qfc95" event={"ID":"9e40ce76-0ae5-42cc-9670-a40bb4b4e2e4","Type":"ContainerStarted","Data":"30b951e156dfdbb9cd41ebb1f1a7e18b5bbd46874bd0a27adfbf22acd9edf7a0"} Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.882173 4906 patch_prober.go:28] interesting pod/downloads-7954f5f757-zkqrr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.882238 4906 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zkqrr" podUID="1c418134-7dc2-42f5-b1ce-c22a2b77a4b9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.898124 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fhmhd"] Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.942944 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qgvms" podStartSLOduration=126.942920359 podStartE2EDuration="2m6.942920359s" podCreationTimestamp="2026-03-10 00:07:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:15.922051579 +0000 UTC m=+182.069946691" watchObservedRunningTime="2026-03-10 00:09:15.942920359 +0000 UTC m=+182.090815471" Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.943942 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:15 crc kubenswrapper[4906]: E0310 00:09:15.944099 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:16.444087512 +0000 UTC m=+182.591982624 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.954758 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52694cc4-226b-4bd5-a6c7-0ebd711926e2-catalog-content\") pod \"community-operators-fhmhd\" (UID: \"52694cc4-226b-4bd5-a6c7-0ebd711926e2\") " pod="openshift-marketplace/community-operators-fhmhd" Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.954977 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.955045 4906 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc4d2e8f-54ca-464b-b186-432747b22864-catalog-content\") pod \"certified-operators-nvvhx\" (UID: \"dc4d2e8f-54ca-464b-b186-432747b22864\") " pod="openshift-marketplace/certified-operators-nvvhx" Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.955117 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4f97\" (UniqueName: \"kubernetes.io/projected/dc4d2e8f-54ca-464b-b186-432747b22864-kube-api-access-j4f97\") pod \"certified-operators-nvvhx\" (UID: \"dc4d2e8f-54ca-464b-b186-432747b22864\") " pod="openshift-marketplace/certified-operators-nvvhx" Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.955148 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxrkm\" (UniqueName: \"kubernetes.io/projected/52694cc4-226b-4bd5-a6c7-0ebd711926e2-kube-api-access-gxrkm\") pod \"community-operators-fhmhd\" (UID: \"52694cc4-226b-4bd5-a6c7-0ebd711926e2\") " pod="openshift-marketplace/community-operators-fhmhd" Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.955191 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc4d2e8f-54ca-464b-b186-432747b22864-utilities\") pod \"certified-operators-nvvhx\" (UID: \"dc4d2e8f-54ca-464b-b186-432747b22864\") " pod="openshift-marketplace/certified-operators-nvvhx" Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.955237 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52694cc4-226b-4bd5-a6c7-0ebd711926e2-utilities\") pod \"community-operators-fhmhd\" (UID: \"52694cc4-226b-4bd5-a6c7-0ebd711926e2\") " pod="openshift-marketplace/community-operators-fhmhd" Mar 10 00:09:15 crc kubenswrapper[4906]: E0310 00:09:15.957549 4906 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:16.457536452 +0000 UTC m=+182.605431564 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66jxp" (UID: "47ee6fa1-0ef0-414f-91af-0f170e94c390") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.958055 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc4d2e8f-54ca-464b-b186-432747b22864-catalog-content\") pod \"certified-operators-nvvhx\" (UID: \"dc4d2e8f-54ca-464b-b186-432747b22864\") " pod="openshift-marketplace/certified-operators-nvvhx" Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.958189 4906 ???:1] "http: TLS handshake error from 192.168.126.11:39604: no serving certificate available for the kubelet" Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.958602 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc4d2e8f-54ca-464b-b186-432747b22864-utilities\") pod \"certified-operators-nvvhx\" (UID: \"dc4d2e8f-54ca-464b-b186-432747b22864\") " pod="openshift-marketplace/certified-operators-nvvhx" Mar 10 00:09:15 crc kubenswrapper[4906]: I0310 00:09:15.986461 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wcpw2" podStartSLOduration=126.98644462 podStartE2EDuration="2m6.98644462s" 
podCreationTimestamp="2026-03-10 00:07:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:15.984059942 +0000 UTC m=+182.131955144" watchObservedRunningTime="2026-03-10 00:09:15.98644462 +0000 UTC m=+182.134339732" Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.053088 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jbwl2"] Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.082821 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jbwl2" Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.090064 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.090577 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52694cc4-226b-4bd5-a6c7-0ebd711926e2-catalog-content\") pod \"community-operators-fhmhd\" (UID: \"52694cc4-226b-4bd5-a6c7-0ebd711926e2\") " pod="openshift-marketplace/community-operators-fhmhd" Mar 10 00:09:16 crc kubenswrapper[4906]: E0310 00:09:16.091311 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:16.591255953 +0000 UTC m=+182.739151065 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.091923 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52694cc4-226b-4bd5-a6c7-0ebd711926e2-catalog-content\") pod \"community-operators-fhmhd\" (UID: \"52694cc4-226b-4bd5-a6c7-0ebd711926e2\") " pod="openshift-marketplace/community-operators-fhmhd" Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.096073 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jbwl2"] Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.096135 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.096485 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxrkm\" (UniqueName: \"kubernetes.io/projected/52694cc4-226b-4bd5-a6c7-0ebd711926e2-kube-api-access-gxrkm\") pod \"community-operators-fhmhd\" (UID: \"52694cc4-226b-4bd5-a6c7-0ebd711926e2\") " pod="openshift-marketplace/community-operators-fhmhd" Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.096822 4906 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52694cc4-226b-4bd5-a6c7-0ebd711926e2-utilities\") pod \"community-operators-fhmhd\" (UID: \"52694cc4-226b-4bd5-a6c7-0ebd711926e2\") " pod="openshift-marketplace/community-operators-fhmhd" Mar 10 00:09:16 crc kubenswrapper[4906]: E0310 00:09:16.104220 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:16.604199049 +0000 UTC m=+182.752094161 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66jxp" (UID: "47ee6fa1-0ef0-414f-91af-0f170e94c390") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.107351 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52694cc4-226b-4bd5-a6c7-0ebd711926e2-utilities\") pod \"community-operators-fhmhd\" (UID: \"52694cc4-226b-4bd5-a6c7-0ebd711926e2\") " pod="openshift-marketplace/community-operators-fhmhd" Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.108199 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4f97\" (UniqueName: \"kubernetes.io/projected/dc4d2e8f-54ca-464b-b186-432747b22864-kube-api-access-j4f97\") pod \"certified-operators-nvvhx\" (UID: \"dc4d2e8f-54ca-464b-b186-432747b22864\") " pod="openshift-marketplace/certified-operators-nvvhx" Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.131363 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-ingress-canary/ingress-canary-b4rmk" podStartSLOduration=9.131341267 podStartE2EDuration="9.131341267s" podCreationTimestamp="2026-03-10 00:09:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:16.128337582 +0000 UTC m=+182.276232694" watchObservedRunningTime="2026-03-10 00:09:16.131341267 +0000 UTC m=+182.279236379" Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.163752 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxrkm\" (UniqueName: \"kubernetes.io/projected/52694cc4-226b-4bd5-a6c7-0ebd711926e2-kube-api-access-gxrkm\") pod \"community-operators-fhmhd\" (UID: \"52694cc4-226b-4bd5-a6c7-0ebd711926e2\") " pod="openshift-marketplace/community-operators-fhmhd" Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.200190 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.200442 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8576\" (UniqueName: \"kubernetes.io/projected/bfd0c098-c58b-456c-a9b2-270e749bc274-kube-api-access-d8576\") pod \"certified-operators-jbwl2\" (UID: \"bfd0c098-c58b-456c-a9b2-270e749bc274\") " pod="openshift-marketplace/certified-operators-jbwl2" Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.200491 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfd0c098-c58b-456c-a9b2-270e749bc274-catalog-content\") pod \"certified-operators-jbwl2\" (UID: 
\"bfd0c098-c58b-456c-a9b2-270e749bc274\") " pod="openshift-marketplace/certified-operators-jbwl2" Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.200518 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfd0c098-c58b-456c-a9b2-270e749bc274-utilities\") pod \"certified-operators-jbwl2\" (UID: \"bfd0c098-c58b-456c-a9b2-270e749bc274\") " pod="openshift-marketplace/certified-operators-jbwl2" Mar 10 00:09:16 crc kubenswrapper[4906]: E0310 00:09:16.200693 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:16.700672387 +0000 UTC m=+182.848567499 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.228078 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fhmhd" Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.234098 4906 ???:1] "http: TLS handshake error from 192.168.126.11:39620: no serving certificate available for the kubelet" Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.251800 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-v25gg"] Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.256334 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-87hqt"] Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.302457 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.302554 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8576\" (UniqueName: \"kubernetes.io/projected/bfd0c098-c58b-456c-a9b2-270e749bc274-kube-api-access-d8576\") pod \"certified-operators-jbwl2\" (UID: \"bfd0c098-c58b-456c-a9b2-270e749bc274\") " pod="openshift-marketplace/certified-operators-jbwl2" Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.302809 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfd0c098-c58b-456c-a9b2-270e749bc274-catalog-content\") pod \"certified-operators-jbwl2\" (UID: \"bfd0c098-c58b-456c-a9b2-270e749bc274\") " pod="openshift-marketplace/certified-operators-jbwl2" Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.302838 4906 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfd0c098-c58b-456c-a9b2-270e749bc274-utilities\") pod \"certified-operators-jbwl2\" (UID: \"bfd0c098-c58b-456c-a9b2-270e749bc274\") " pod="openshift-marketplace/certified-operators-jbwl2" Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.303286 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfd0c098-c58b-456c-a9b2-270e749bc274-utilities\") pod \"certified-operators-jbwl2\" (UID: \"bfd0c098-c58b-456c-a9b2-270e749bc274\") " pod="openshift-marketplace/certified-operators-jbwl2" Mar 10 00:09:16 crc kubenswrapper[4906]: E0310 00:09:16.303819 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:16.803803003 +0000 UTC m=+182.951698115 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66jxp" (UID: "47ee6fa1-0ef0-414f-91af-0f170e94c390") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.307176 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfd0c098-c58b-456c-a9b2-270e749bc274-catalog-content\") pod \"certified-operators-jbwl2\" (UID: \"bfd0c098-c58b-456c-a9b2-270e749bc274\") " pod="openshift-marketplace/certified-operators-jbwl2" Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.330834 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29551680-7lx2x" podStartSLOduration=128.330817977 podStartE2EDuration="2m8.330817977s" podCreationTimestamp="2026-03-10 00:07:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:16.328051438 +0000 UTC m=+182.475946550" watchObservedRunningTime="2026-03-10 00:09:16.330817977 +0000 UTC m=+182.478713089" Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.331572 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nvvhx" Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.378688 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8576\" (UniqueName: \"kubernetes.io/projected/bfd0c098-c58b-456c-a9b2-270e749bc274-kube-api-access-d8576\") pod \"certified-operators-jbwl2\" (UID: \"bfd0c098-c58b-456c-a9b2-270e749bc274\") " pod="openshift-marketplace/certified-operators-jbwl2" Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.405663 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:16 crc kubenswrapper[4906]: E0310 00:09:16.407182 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:16.907159405 +0000 UTC m=+183.055054517 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.407329 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:16 crc kubenswrapper[4906]: E0310 00:09:16.407694 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:16.90768702 +0000 UTC m=+183.055582132 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66jxp" (UID: "47ee6fa1-0ef0-414f-91af-0f170e94c390") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.430194 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jbwl2" Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.439163 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qgdqx" podStartSLOduration=127.43914716 podStartE2EDuration="2m7.43914716s" podCreationTimestamp="2026-03-10 00:07:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:16.436184606 +0000 UTC m=+182.584079718" watchObservedRunningTime="2026-03-10 00:09:16.43914716 +0000 UTC m=+182.587042272" Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.440243 4906 ???:1] "http: TLS handshake error from 192.168.126.11:39632: no serving certificate available for the kubelet" Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.510149 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jws8x" podStartSLOduration=127.510125327 podStartE2EDuration="2m7.510125327s" podCreationTimestamp="2026-03-10 00:07:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:16.50988782 +0000 UTC m=+182.657782932" watchObservedRunningTime="2026-03-10 00:09:16.510125327 +0000 UTC m=+182.658020439" Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.510852 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:16 crc kubenswrapper[4906]: E0310 00:09:16.511419 4906 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:17.011401833 +0000 UTC m=+183.159296935 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.565039 4906 ???:1] "http: TLS handshake error from 192.168.126.11:39646: no serving certificate available for the kubelet" Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.579892 4906 patch_prober.go:28] interesting pod/router-default-5444994796-kpmwl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 00:09:16 crc kubenswrapper[4906]: [-]has-synced failed: reason withheld Mar 10 00:09:16 crc kubenswrapper[4906]: [+]process-running ok Mar 10 00:09:16 crc kubenswrapper[4906]: healthz check failed Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.579957 4906 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kpmwl" podUID="6bbf8e83-d583-49de-a12c-6f0a3953dc67" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.618257 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:16 crc kubenswrapper[4906]: E0310 00:09:16.618717 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:17.118694556 +0000 UTC m=+183.266589668 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66jxp" (UID: "47ee6fa1-0ef0-414f-91af-0f170e94c390") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.652166 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-qnppg" podStartSLOduration=128.652142082 podStartE2EDuration="2m8.652142082s" podCreationTimestamp="2026-03-10 00:07:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:16.620272861 +0000 UTC m=+182.768167973" watchObservedRunningTime="2026-03-10 00:09:16.652142082 +0000 UTC m=+182.800037194" Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.653926 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sjjvc" podStartSLOduration=128.653920172 podStartE2EDuration="2m8.653920172s" podCreationTimestamp="2026-03-10 00:07:08 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:16.652712868 +0000 UTC m=+182.800607980" watchObservedRunningTime="2026-03-10 00:09:16.653920172 +0000 UTC m=+182.801815284" Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.676038 4906 ???:1] "http: TLS handshake error from 192.168.126.11:39654: no serving certificate available for the kubelet" Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.719431 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:16 crc kubenswrapper[4906]: E0310 00:09:16.719921 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:17.219904138 +0000 UTC m=+183.367799250 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.778038 4906 ???:1] "http: TLS handshake error from 192.168.126.11:39660: no serving certificate available for the kubelet" Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.827242 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:16 crc kubenswrapper[4906]: E0310 00:09:16.827686 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:17.327672705 +0000 UTC m=+183.475567817 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66jxp" (UID: "47ee6fa1-0ef0-414f-91af-0f170e94c390") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.835864 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-t65x9"] Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.881730 4906 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-rlnd7 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.43:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.881799 4906 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rlnd7" podUID="332da74f-063b-48aa-8b86-8646fddcadda" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.43:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.923922 4906 ???:1] "http: TLS handshake error from 192.168.126.11:39664: no serving certificate available for the kubelet" Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.925740 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-x6g9x" event={"ID":"d49db0cb-de7e-4194-9f6e-8b58cf5f98fb","Type":"ContainerStarted","Data":"c17dead120aa3b70f41ee4d82f731911a0aca6973c2e39e3965f967564501231"} Mar 10 
00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.928007 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:16 crc kubenswrapper[4906]: E0310 00:09:16.928275 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:17.428258019 +0000 UTC m=+183.576153131 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.977060 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-j4vpr" event={"ID":"220fec7e-5622-4a55-ab08-359de5d87c6f","Type":"ContainerStarted","Data":"7c43fec7745b8298731912c74346c5e739d9555bc6eb8759c360be9d1a8d6d26"} Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.981819 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-v25gg" podUID="85226c05-ab07-4243-89af-c58b7c3d1f43" containerName="controller-manager" containerID="cri-o://a18807da7208beee65e915c4c8e80b88ace1d81d39610d4f7625ca5af530b46f" gracePeriod=30 Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.983398 
4906 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-vw9rg container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.983426 4906 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-vw9rg" podUID="a46863f6-02e5-4d35-8b59-216377e41403" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.983426 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-87hqt" podUID="ebf8d4f5-6995-45dd-be49-491ced904443" containerName="route-controller-manager" containerID="cri-o://513f92dd87b50099db2ffba140f2dc61b7bc0eb88993d0065419e3038be1bf08" gracePeriod=30 Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.983928 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-j4vpr" Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.986124 4906 patch_prober.go:28] interesting pod/downloads-7954f5f757-zkqrr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Mar 10 00:09:16 crc kubenswrapper[4906]: I0310 00:09:16.986160 4906 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zkqrr" podUID="1c418134-7dc2-42f5-b1ce-c22a2b77a4b9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Mar 10 00:09:17 crc 
kubenswrapper[4906]: I0310 00:09:17.000106 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rlnd7" Mar 10 00:09:17 crc kubenswrapper[4906]: I0310 00:09:17.007721 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-j4vpr" podStartSLOduration=10.007698585 podStartE2EDuration="10.007698585s" podCreationTimestamp="2026-03-10 00:09:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:17.00612457 +0000 UTC m=+183.154019682" watchObservedRunningTime="2026-03-10 00:09:17.007698585 +0000 UTC m=+183.155593697" Mar 10 00:09:17 crc kubenswrapper[4906]: I0310 00:09:17.030461 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:17 crc kubenswrapper[4906]: E0310 00:09:17.034663 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:17.534648217 +0000 UTC m=+183.682543329 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66jxp" (UID: "47ee6fa1-0ef0-414f-91af-0f170e94c390") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:17 crc kubenswrapper[4906]: I0310 00:09:17.092423 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fhmhd"] Mar 10 00:09:17 crc kubenswrapper[4906]: I0310 00:09:17.132450 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:17 crc kubenswrapper[4906]: E0310 00:09:17.132768 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:17.632742331 +0000 UTC m=+183.780637443 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:17 crc kubenswrapper[4906]: I0310 00:09:17.132865 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:17 crc kubenswrapper[4906]: E0310 00:09:17.133284 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:17.633275616 +0000 UTC m=+183.781170728 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66jxp" (UID: "47ee6fa1-0ef0-414f-91af-0f170e94c390") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:17 crc kubenswrapper[4906]: I0310 00:09:17.163428 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nvvhx"] Mar 10 00:09:17 crc kubenswrapper[4906]: I0310 00:09:17.178026 4906 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-vt2mt container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 00:09:17 crc kubenswrapper[4906]: I0310 00:09:17.178093 4906 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vt2mt" podUID="7f9945a5-74c6-482d-b4c4-1097e3efebe0" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 00:09:17 crc kubenswrapper[4906]: I0310 00:09:17.178770 4906 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-vt2mt container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.17:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 00:09:17 crc kubenswrapper[4906]: I0310 00:09:17.178794 4906 prober.go:107] "Probe 
failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vt2mt" podUID="7f9945a5-74c6-482d-b4c4-1097e3efebe0" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.17:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 00:09:17 crc kubenswrapper[4906]: W0310 00:09:17.224352 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc4d2e8f_54ca_464b_b186_432747b22864.slice/crio-90e818fe2518b16ed23dc1742efc9c1c19ed36092fcfa2efd1dd7db2ca51273c WatchSource:0}: Error finding container 90e818fe2518b16ed23dc1742efc9c1c19ed36092fcfa2efd1dd7db2ca51273c: Status 404 returned error can't find the container with id 90e818fe2518b16ed23dc1742efc9c1c19ed36092fcfa2efd1dd7db2ca51273c Mar 10 00:09:17 crc kubenswrapper[4906]: I0310 00:09:17.234325 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:17 crc kubenswrapper[4906]: E0310 00:09:17.234704 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:17.734686142 +0000 UTC m=+183.882581254 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:17 crc kubenswrapper[4906]: I0310 00:09:17.318728 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jbwl2"] Mar 10 00:09:17 crc kubenswrapper[4906]: I0310 00:09:17.335415 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:17 crc kubenswrapper[4906]: E0310 00:09:17.335857 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:17.835843572 +0000 UTC m=+183.983738684 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66jxp" (UID: "47ee6fa1-0ef0-414f-91af-0f170e94c390") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:17 crc kubenswrapper[4906]: W0310 00:09:17.368395 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfd0c098_c58b_456c_a9b2_270e749bc274.slice/crio-feef254d9c7c35598955de9ae64d74d525c038cba7aaedbab6d167176c2d656a WatchSource:0}: Error finding container feef254d9c7c35598955de9ae64d74d525c038cba7aaedbab6d167176c2d656a: Status 404 returned error can't find the container with id feef254d9c7c35598955de9ae64d74d525c038cba7aaedbab6d167176c2d656a Mar 10 00:09:17 crc kubenswrapper[4906]: I0310 00:09:17.441025 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:17 crc kubenswrapper[4906]: I0310 00:09:17.441546 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/55d944ce-605e-41a7-9211-a5bc388145f1-metrics-certs\") pod \"network-metrics-daemon-5bn7b\" (UID: \"55d944ce-605e-41a7-9211-a5bc388145f1\") " pod="openshift-multus/network-metrics-daemon-5bn7b" Mar 10 00:09:17 crc kubenswrapper[4906]: E0310 00:09:17.441802 4906 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:17.941783798 +0000 UTC m=+184.089678910 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:17 crc kubenswrapper[4906]: I0310 00:09:17.445079 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 10 00:09:17 crc kubenswrapper[4906]: I0310 00:09:17.464243 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/55d944ce-605e-41a7-9211-a5bc388145f1-metrics-certs\") pod \"network-metrics-daemon-5bn7b\" (UID: \"55d944ce-605e-41a7-9211-a5bc388145f1\") " pod="openshift-multus/network-metrics-daemon-5bn7b" Mar 10 00:09:17 crc kubenswrapper[4906]: I0310 00:09:17.542546 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:17 crc kubenswrapper[4906]: E0310 00:09:17.543034 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-10 00:09:18.04301457 +0000 UTC m=+184.190909682 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66jxp" (UID: "47ee6fa1-0ef0-414f-91af-0f170e94c390") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:17 crc kubenswrapper[4906]: I0310 00:09:17.580071 4906 patch_prober.go:28] interesting pod/router-default-5444994796-kpmwl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 00:09:17 crc kubenswrapper[4906]: [-]has-synced failed: reason withheld Mar 10 00:09:17 crc kubenswrapper[4906]: [+]process-running ok Mar 10 00:09:17 crc kubenswrapper[4906]: healthz check failed Mar 10 00:09:17 crc kubenswrapper[4906]: I0310 00:09:17.580132 4906 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kpmwl" podUID="6bbf8e83-d583-49de-a12c-6f0a3953dc67" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 00:09:17 crc kubenswrapper[4906]: I0310 00:09:17.598751 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-whmcg"] Mar 10 00:09:17 crc kubenswrapper[4906]: I0310 00:09:17.603747 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-whmcg" Mar 10 00:09:17 crc kubenswrapper[4906]: I0310 00:09:17.621037 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 10 00:09:17 crc kubenswrapper[4906]: I0310 00:09:17.643311 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:17 crc kubenswrapper[4906]: I0310 00:09:17.643588 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1beb2f4-c1c5-488d-8c76-bed30174a0de-catalog-content\") pod \"redhat-marketplace-whmcg\" (UID: \"f1beb2f4-c1c5-488d-8c76-bed30174a0de\") " pod="openshift-marketplace/redhat-marketplace-whmcg" Mar 10 00:09:17 crc kubenswrapper[4906]: I0310 00:09:17.643617 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvbd9\" (UniqueName: \"kubernetes.io/projected/f1beb2f4-c1c5-488d-8c76-bed30174a0de-kube-api-access-zvbd9\") pod \"redhat-marketplace-whmcg\" (UID: \"f1beb2f4-c1c5-488d-8c76-bed30174a0de\") " pod="openshift-marketplace/redhat-marketplace-whmcg" Mar 10 00:09:17 crc kubenswrapper[4906]: I0310 00:09:17.643712 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1beb2f4-c1c5-488d-8c76-bed30174a0de-utilities\") pod \"redhat-marketplace-whmcg\" (UID: \"f1beb2f4-c1c5-488d-8c76-bed30174a0de\") " pod="openshift-marketplace/redhat-marketplace-whmcg" Mar 10 00:09:17 crc kubenswrapper[4906]: E0310 00:09:17.643835 4906 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:18.14381661 +0000 UTC m=+184.291711722 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:17 crc kubenswrapper[4906]: I0310 00:09:17.644538 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-whmcg"] Mar 10 00:09:17 crc kubenswrapper[4906]: I0310 00:09:17.708005 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 10 00:09:17 crc kubenswrapper[4906]: I0310 00:09:17.711972 4906 ???:1] "http: TLS handshake error from 192.168.126.11:39672: no serving certificate available for the kubelet" Mar 10 00:09:17 crc kubenswrapper[4906]: I0310 00:09:17.714196 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5bn7b" Mar 10 00:09:17 crc kubenswrapper[4906]: I0310 00:09:17.748955 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1beb2f4-c1c5-488d-8c76-bed30174a0de-catalog-content\") pod \"redhat-marketplace-whmcg\" (UID: \"f1beb2f4-c1c5-488d-8c76-bed30174a0de\") " pod="openshift-marketplace/redhat-marketplace-whmcg" Mar 10 00:09:17 crc kubenswrapper[4906]: I0310 00:09:17.748996 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvbd9\" (UniqueName: \"kubernetes.io/projected/f1beb2f4-c1c5-488d-8c76-bed30174a0de-kube-api-access-zvbd9\") pod \"redhat-marketplace-whmcg\" (UID: \"f1beb2f4-c1c5-488d-8c76-bed30174a0de\") " pod="openshift-marketplace/redhat-marketplace-whmcg" Mar 10 00:09:17 crc kubenswrapper[4906]: I0310 00:09:17.749067 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1beb2f4-c1c5-488d-8c76-bed30174a0de-utilities\") pod \"redhat-marketplace-whmcg\" (UID: \"f1beb2f4-c1c5-488d-8c76-bed30174a0de\") " pod="openshift-marketplace/redhat-marketplace-whmcg" Mar 10 00:09:17 crc kubenswrapper[4906]: I0310 00:09:17.749107 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:17 crc kubenswrapper[4906]: E0310 00:09:17.749436 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-10 00:09:18.249420426 +0000 UTC m=+184.397315538 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66jxp" (UID: "47ee6fa1-0ef0-414f-91af-0f170e94c390") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:17 crc kubenswrapper[4906]: I0310 00:09:17.749987 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1beb2f4-c1c5-488d-8c76-bed30174a0de-catalog-content\") pod \"redhat-marketplace-whmcg\" (UID: \"f1beb2f4-c1c5-488d-8c76-bed30174a0de\") " pod="openshift-marketplace/redhat-marketplace-whmcg" Mar 10 00:09:17 crc kubenswrapper[4906]: I0310 00:09:17.752377 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1beb2f4-c1c5-488d-8c76-bed30174a0de-utilities\") pod \"redhat-marketplace-whmcg\" (UID: \"f1beb2f4-c1c5-488d-8c76-bed30174a0de\") " pod="openshift-marketplace/redhat-marketplace-whmcg" Mar 10 00:09:17 crc kubenswrapper[4906]: I0310 00:09:17.761993 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vt2mt" Mar 10 00:09:17 crc kubenswrapper[4906]: I0310 00:09:17.784762 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvbd9\" (UniqueName: \"kubernetes.io/projected/f1beb2f4-c1c5-488d-8c76-bed30174a0de-kube-api-access-zvbd9\") pod \"redhat-marketplace-whmcg\" (UID: \"f1beb2f4-c1c5-488d-8c76-bed30174a0de\") " pod="openshift-marketplace/redhat-marketplace-whmcg" Mar 10 00:09:17 crc kubenswrapper[4906]: I0310 
00:09:17.851799 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:17 crc kubenswrapper[4906]: E0310 00:09:17.852204 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:18.352182792 +0000 UTC m=+184.500077904 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:17 crc kubenswrapper[4906]: I0310 00:09:17.870463 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-87hqt" Mar 10 00:09:17 crc kubenswrapper[4906]: I0310 00:09:17.942763 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-whmcg" Mar 10 00:09:17 crc kubenswrapper[4906]: I0310 00:09:17.958537 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebf8d4f5-6995-45dd-be49-491ced904443-serving-cert\") pod \"ebf8d4f5-6995-45dd-be49-491ced904443\" (UID: \"ebf8d4f5-6995-45dd-be49-491ced904443\") " Mar 10 00:09:17 crc kubenswrapper[4906]: I0310 00:09:17.958751 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebf8d4f5-6995-45dd-be49-491ced904443-config\") pod \"ebf8d4f5-6995-45dd-be49-491ced904443\" (UID: \"ebf8d4f5-6995-45dd-be49-491ced904443\") " Mar 10 00:09:17 crc kubenswrapper[4906]: I0310 00:09:17.958777 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ebf8d4f5-6995-45dd-be49-491ced904443-client-ca\") pod \"ebf8d4f5-6995-45dd-be49-491ced904443\" (UID: \"ebf8d4f5-6995-45dd-be49-491ced904443\") " Mar 10 00:09:17 crc kubenswrapper[4906]: I0310 00:09:17.958801 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnkvl\" (UniqueName: \"kubernetes.io/projected/ebf8d4f5-6995-45dd-be49-491ced904443-kube-api-access-jnkvl\") pod \"ebf8d4f5-6995-45dd-be49-491ced904443\" (UID: \"ebf8d4f5-6995-45dd-be49-491ced904443\") " Mar 10 00:09:17 crc kubenswrapper[4906]: I0310 00:09:17.958916 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:17 crc kubenswrapper[4906]: E0310 00:09:17.959283 4906 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:18.459270729 +0000 UTC m=+184.607165841 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66jxp" (UID: "47ee6fa1-0ef0-414f-91af-0f170e94c390") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:17 crc kubenswrapper[4906]: I0310 00:09:17.960695 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebf8d4f5-6995-45dd-be49-491ced904443-client-ca" (OuterVolumeSpecName: "client-ca") pod "ebf8d4f5-6995-45dd-be49-491ced904443" (UID: "ebf8d4f5-6995-45dd-be49-491ced904443"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:09:17 crc kubenswrapper[4906]: I0310 00:09:17.961109 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebf8d4f5-6995-45dd-be49-491ced904443-config" (OuterVolumeSpecName: "config") pod "ebf8d4f5-6995-45dd-be49-491ced904443" (UID: "ebf8d4f5-6995-45dd-be49-491ced904443"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:09:17 crc kubenswrapper[4906]: I0310 00:09:17.965318 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebf8d4f5-6995-45dd-be49-491ced904443-kube-api-access-jnkvl" (OuterVolumeSpecName: "kube-api-access-jnkvl") pod "ebf8d4f5-6995-45dd-be49-491ced904443" (UID: "ebf8d4f5-6995-45dd-be49-491ced904443"). 
InnerVolumeSpecName "kube-api-access-jnkvl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:09:17 crc kubenswrapper[4906]: I0310 00:09:17.978525 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebf8d4f5-6995-45dd-be49-491ced904443-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ebf8d4f5-6995-45dd-be49-491ced904443" (UID: "ebf8d4f5-6995-45dd-be49-491ced904443"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:09:17 crc kubenswrapper[4906]: I0310 00:09:17.979543 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f5fcb6c78-2jlb2"] Mar 10 00:09:17 crc kubenswrapper[4906]: E0310 00:09:17.979832 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebf8d4f5-6995-45dd-be49-491ced904443" containerName="route-controller-manager" Mar 10 00:09:17 crc kubenswrapper[4906]: I0310 00:09:17.979847 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebf8d4f5-6995-45dd-be49-491ced904443" containerName="route-controller-manager" Mar 10 00:09:17 crc kubenswrapper[4906]: I0310 00:09:17.979945 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebf8d4f5-6995-45dd-be49-491ced904443" containerName="route-controller-manager" Mar 10 00:09:17 crc kubenswrapper[4906]: I0310 00:09:17.980507 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f5fcb6c78-2jlb2" Mar 10 00:09:17 crc kubenswrapper[4906]: I0310 00:09:17.984591 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-js2xl"] Mar 10 00:09:17 crc kubenswrapper[4906]: I0310 00:09:17.992373 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-js2xl" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.001777 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nvvhx" event={"ID":"dc4d2e8f-54ca-464b-b186-432747b22864","Type":"ContainerStarted","Data":"90e818fe2518b16ed23dc1742efc9c1c19ed36092fcfa2efd1dd7db2ca51273c"} Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.014144 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f5fcb6c78-2jlb2"] Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.037400 4906 generic.go:334] "Generic (PLEG): container finished" podID="5331074a-1c86-455a-80e9-6f945936e218" containerID="e57b6c2d2f8a97e0b7c0aa7eba8757eb9df4cb672e4128d3e93135e5f36c7d36" exitCode=0 Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.037482 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t65x9" event={"ID":"5331074a-1c86-455a-80e9-6f945936e218","Type":"ContainerDied","Data":"e57b6c2d2f8a97e0b7c0aa7eba8757eb9df4cb672e4128d3e93135e5f36c7d36"} Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.037509 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t65x9" event={"ID":"5331074a-1c86-455a-80e9-6f945936e218","Type":"ContainerStarted","Data":"44f72a4b931b74aab5af064c0720b88108e9acdd89a9d3ca2815356148c420ef"} Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.059651 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.059747 4906 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/752d09a7-ead6-439b-a35c-6abc6a00afdb-client-ca\") pod \"route-controller-manager-f5fcb6c78-2jlb2\" (UID: \"752d09a7-ead6-439b-a35c-6abc6a00afdb\") " pod="openshift-route-controller-manager/route-controller-manager-f5fcb6c78-2jlb2" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.059773 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnwwc\" (UniqueName: \"kubernetes.io/projected/752d09a7-ead6-439b-a35c-6abc6a00afdb-kube-api-access-lnwwc\") pod \"route-controller-manager-f5fcb6c78-2jlb2\" (UID: \"752d09a7-ead6-439b-a35c-6abc6a00afdb\") " pod="openshift-route-controller-manager/route-controller-manager-f5fcb6c78-2jlb2" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.059834 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a68ed18-11e8-4943-854e-a8e4a5566313-catalog-content\") pod \"redhat-marketplace-js2xl\" (UID: \"9a68ed18-11e8-4943-854e-a8e4a5566313\") " pod="openshift-marketplace/redhat-marketplace-js2xl" Mar 10 00:09:18 crc kubenswrapper[4906]: E0310 00:09:18.059875 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:18.559839913 +0000 UTC m=+184.707735025 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.059919 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/752d09a7-ead6-439b-a35c-6abc6a00afdb-serving-cert\") pod \"route-controller-manager-f5fcb6c78-2jlb2\" (UID: \"752d09a7-ead6-439b-a35c-6abc6a00afdb\") " pod="openshift-route-controller-manager/route-controller-manager-f5fcb6c78-2jlb2" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.060254 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a68ed18-11e8-4943-854e-a8e4a5566313-utilities\") pod \"redhat-marketplace-js2xl\" (UID: \"9a68ed18-11e8-4943-854e-a8e4a5566313\") " pod="openshift-marketplace/redhat-marketplace-js2xl" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.060926 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/752d09a7-ead6-439b-a35c-6abc6a00afdb-config\") pod \"route-controller-manager-f5fcb6c78-2jlb2\" (UID: \"752d09a7-ead6-439b-a35c-6abc6a00afdb\") " pod="openshift-route-controller-manager/route-controller-manager-f5fcb6c78-2jlb2" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.061015 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.061036 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w58s7\" (UniqueName: \"kubernetes.io/projected/9a68ed18-11e8-4943-854e-a8e4a5566313-kube-api-access-w58s7\") pod \"redhat-marketplace-js2xl\" (UID: \"9a68ed18-11e8-4943-854e-a8e4a5566313\") " pod="openshift-marketplace/redhat-marketplace-js2xl" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.061054 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhmhd" event={"ID":"52694cc4-226b-4bd5-a6c7-0ebd711926e2","Type":"ContainerDied","Data":"6cb827060cebc472bb56218c6c152feb359da7d8ec87352ef260da8e18bf5232"} Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.061108 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnkvl\" (UniqueName: \"kubernetes.io/projected/ebf8d4f5-6995-45dd-be49-491ced904443-kube-api-access-jnkvl\") on node \"crc\" DevicePath \"\"" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.061122 4906 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebf8d4f5-6995-45dd-be49-491ced904443-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.061133 4906 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebf8d4f5-6995-45dd-be49-491ced904443-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.061145 4906 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/ebf8d4f5-6995-45dd-be49-491ced904443-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.061026 4906 generic.go:334] "Generic (PLEG): container finished" podID="52694cc4-226b-4bd5-a6c7-0ebd711926e2" containerID="6cb827060cebc472bb56218c6c152feb359da7d8ec87352ef260da8e18bf5232" exitCode=0 Mar 10 00:09:18 crc kubenswrapper[4906]: E0310 00:09:18.062151 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:18.562135758 +0000 UTC m=+184.710030870 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66jxp" (UID: "47ee6fa1-0ef0-414f-91af-0f170e94c390") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.067245 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhmhd" event={"ID":"52694cc4-226b-4bd5-a6c7-0ebd711926e2","Type":"ContainerStarted","Data":"531e6a51b8153b56cfa30ec89f62b24a5d768da45e272b94e7f98afec3db106f"} Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.100347 4906 generic.go:334] "Generic (PLEG): container finished" podID="ebf8d4f5-6995-45dd-be49-491ced904443" containerID="513f92dd87b50099db2ffba140f2dc61b7bc0eb88993d0065419e3038be1bf08" exitCode=0 Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.100520 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-87hqt" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.101005 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-87hqt" event={"ID":"ebf8d4f5-6995-45dd-be49-491ced904443","Type":"ContainerDied","Data":"513f92dd87b50099db2ffba140f2dc61b7bc0eb88993d0065419e3038be1bf08"} Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.101060 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-87hqt" event={"ID":"ebf8d4f5-6995-45dd-be49-491ced904443","Type":"ContainerDied","Data":"1c2fee16554e2c72b5959c960a9110b2cb0dc7a027f9abbcd9c873cc6082a7ff"} Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.101079 4906 scope.go:117] "RemoveContainer" containerID="513f92dd87b50099db2ffba140f2dc61b7bc0eb88993d0065419e3038be1bf08" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.116954 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-js2xl"] Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.142861 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jbwl2" event={"ID":"bfd0c098-c58b-456c-a9b2-270e749bc274","Type":"ContainerStarted","Data":"feef254d9c7c35598955de9ae64d74d525c038cba7aaedbab6d167176c2d656a"} Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.162473 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:18 crc kubenswrapper[4906]: E0310 00:09:18.163112 4906 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:18.663073072 +0000 UTC m=+184.810968184 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.163177 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.163232 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w58s7\" (UniqueName: \"kubernetes.io/projected/9a68ed18-11e8-4943-854e-a8e4a5566313-kube-api-access-w58s7\") pod \"redhat-marketplace-js2xl\" (UID: \"9a68ed18-11e8-4943-854e-a8e4a5566313\") " pod="openshift-marketplace/redhat-marketplace-js2xl" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.163304 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/752d09a7-ead6-439b-a35c-6abc6a00afdb-client-ca\") pod \"route-controller-manager-f5fcb6c78-2jlb2\" (UID: \"752d09a7-ead6-439b-a35c-6abc6a00afdb\") " 
pod="openshift-route-controller-manager/route-controller-manager-f5fcb6c78-2jlb2" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.163338 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnwwc\" (UniqueName: \"kubernetes.io/projected/752d09a7-ead6-439b-a35c-6abc6a00afdb-kube-api-access-lnwwc\") pod \"route-controller-manager-f5fcb6c78-2jlb2\" (UID: \"752d09a7-ead6-439b-a35c-6abc6a00afdb\") " pod="openshift-route-controller-manager/route-controller-manager-f5fcb6c78-2jlb2" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.163409 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a68ed18-11e8-4943-854e-a8e4a5566313-catalog-content\") pod \"redhat-marketplace-js2xl\" (UID: \"9a68ed18-11e8-4943-854e-a8e4a5566313\") " pod="openshift-marketplace/redhat-marketplace-js2xl" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.163432 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/752d09a7-ead6-439b-a35c-6abc6a00afdb-serving-cert\") pod \"route-controller-manager-f5fcb6c78-2jlb2\" (UID: \"752d09a7-ead6-439b-a35c-6abc6a00afdb\") " pod="openshift-route-controller-manager/route-controller-manager-f5fcb6c78-2jlb2" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.163683 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a68ed18-11e8-4943-854e-a8e4a5566313-utilities\") pod \"redhat-marketplace-js2xl\" (UID: \"9a68ed18-11e8-4943-854e-a8e4a5566313\") " pod="openshift-marketplace/redhat-marketplace-js2xl" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.163765 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/752d09a7-ead6-439b-a35c-6abc6a00afdb-config\") pod 
\"route-controller-manager-f5fcb6c78-2jlb2\" (UID: \"752d09a7-ead6-439b-a35c-6abc6a00afdb\") " pod="openshift-route-controller-manager/route-controller-manager-f5fcb6c78-2jlb2" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.165574 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/752d09a7-ead6-439b-a35c-6abc6a00afdb-client-ca\") pod \"route-controller-manager-f5fcb6c78-2jlb2\" (UID: \"752d09a7-ead6-439b-a35c-6abc6a00afdb\") " pod="openshift-route-controller-manager/route-controller-manager-f5fcb6c78-2jlb2" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.166055 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/752d09a7-ead6-439b-a35c-6abc6a00afdb-config\") pod \"route-controller-manager-f5fcb6c78-2jlb2\" (UID: \"752d09a7-ead6-439b-a35c-6abc6a00afdb\") " pod="openshift-route-controller-manager/route-controller-manager-f5fcb6c78-2jlb2" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.166325 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a68ed18-11e8-4943-854e-a8e4a5566313-catalog-content\") pod \"redhat-marketplace-js2xl\" (UID: \"9a68ed18-11e8-4943-854e-a8e4a5566313\") " pod="openshift-marketplace/redhat-marketplace-js2xl" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.166659 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a68ed18-11e8-4943-854e-a8e4a5566313-utilities\") pod \"redhat-marketplace-js2xl\" (UID: \"9a68ed18-11e8-4943-854e-a8e4a5566313\") " pod="openshift-marketplace/redhat-marketplace-js2xl" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.175932 4906 generic.go:334] "Generic (PLEG): container finished" podID="85226c05-ab07-4243-89af-c58b7c3d1f43" 
containerID="a18807da7208beee65e915c4c8e80b88ace1d81d39610d4f7625ca5af530b46f" exitCode=0 Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.176504 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-v25gg" event={"ID":"85226c05-ab07-4243-89af-c58b7c3d1f43","Type":"ContainerDied","Data":"a18807da7208beee65e915c4c8e80b88ace1d81d39610d4f7625ca5af530b46f"} Mar 10 00:09:18 crc kubenswrapper[4906]: E0310 00:09:18.176957 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:18.676933444 +0000 UTC m=+184.824828556 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66jxp" (UID: "47ee6fa1-0ef0-414f-91af-0f170e94c390") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.180385 4906 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-vw9rg container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.177116 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/752d09a7-ead6-439b-a35c-6abc6a00afdb-serving-cert\") pod \"route-controller-manager-f5fcb6c78-2jlb2\" (UID: \"752d09a7-ead6-439b-a35c-6abc6a00afdb\") " 
pod="openshift-route-controller-manager/route-controller-manager-f5fcb6c78-2jlb2" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.180504 4906 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-vw9rg" podUID="a46863f6-02e5-4d35-8b59-216377e41403" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.214465 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w58s7\" (UniqueName: \"kubernetes.io/projected/9a68ed18-11e8-4943-854e-a8e4a5566313-kube-api-access-w58s7\") pod \"redhat-marketplace-js2xl\" (UID: \"9a68ed18-11e8-4943-854e-a8e4a5566313\") " pod="openshift-marketplace/redhat-marketplace-js2xl" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.216531 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnwwc\" (UniqueName: \"kubernetes.io/projected/752d09a7-ead6-439b-a35c-6abc6a00afdb-kube-api-access-lnwwc\") pod \"route-controller-manager-f5fcb6c78-2jlb2\" (UID: \"752d09a7-ead6-439b-a35c-6abc6a00afdb\") " pod="openshift-route-controller-manager/route-controller-manager-f5fcb6c78-2jlb2" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.246095 4906 scope.go:117] "RemoveContainer" containerID="513f92dd87b50099db2ffba140f2dc61b7bc0eb88993d0065419e3038be1bf08" Mar 10 00:09:18 crc kubenswrapper[4906]: E0310 00:09:18.254033 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"513f92dd87b50099db2ffba140f2dc61b7bc0eb88993d0065419e3038be1bf08\": container with ID starting with 513f92dd87b50099db2ffba140f2dc61b7bc0eb88993d0065419e3038be1bf08 not found: ID does not exist" containerID="513f92dd87b50099db2ffba140f2dc61b7bc0eb88993d0065419e3038be1bf08" Mar 10 00:09:18 crc 
kubenswrapper[4906]: I0310 00:09:18.254084 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"513f92dd87b50099db2ffba140f2dc61b7bc0eb88993d0065419e3038be1bf08"} err="failed to get container status \"513f92dd87b50099db2ffba140f2dc61b7bc0eb88993d0065419e3038be1bf08\": rpc error: code = NotFound desc = could not find container \"513f92dd87b50099db2ffba140f2dc61b7bc0eb88993d0065419e3038be1bf08\": container with ID starting with 513f92dd87b50099db2ffba140f2dc61b7bc0eb88993d0065419e3038be1bf08 not found: ID does not exist" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.264687 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.264926 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-v25gg" Mar 10 00:09:18 crc kubenswrapper[4906]: E0310 00:09:18.265088 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:18.765070586 +0000 UTC m=+184.912965698 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.265448 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-87hqt"] Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.274568 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-87hqt"] Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.342246 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f5fcb6c78-2jlb2" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.366176 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5h2s\" (UniqueName: \"kubernetes.io/projected/85226c05-ab07-4243-89af-c58b7c3d1f43-kube-api-access-f5h2s\") pod \"85226c05-ab07-4243-89af-c58b7c3d1f43\" (UID: \"85226c05-ab07-4243-89af-c58b7c3d1f43\") " Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.366280 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85226c05-ab07-4243-89af-c58b7c3d1f43-serving-cert\") pod \"85226c05-ab07-4243-89af-c58b7c3d1f43\" (UID: \"85226c05-ab07-4243-89af-c58b7c3d1f43\") " Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.366358 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/85226c05-ab07-4243-89af-c58b7c3d1f43-proxy-ca-bundles\") pod \"85226c05-ab07-4243-89af-c58b7c3d1f43\" (UID: \"85226c05-ab07-4243-89af-c58b7c3d1f43\") " Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.366400 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/85226c05-ab07-4243-89af-c58b7c3d1f43-client-ca\") pod \"85226c05-ab07-4243-89af-c58b7c3d1f43\" (UID: \"85226c05-ab07-4243-89af-c58b7c3d1f43\") " Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.366720 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85226c05-ab07-4243-89af-c58b7c3d1f43-config\") pod \"85226c05-ab07-4243-89af-c58b7c3d1f43\" (UID: \"85226c05-ab07-4243-89af-c58b7c3d1f43\") " Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.367206 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:18 crc kubenswrapper[4906]: E0310 00:09:18.367623 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:18.867607385 +0000 UTC m=+185.015502497 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66jxp" (UID: "47ee6fa1-0ef0-414f-91af-0f170e94c390") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.371954 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85226c05-ab07-4243-89af-c58b7c3d1f43-client-ca" (OuterVolumeSpecName: "client-ca") pod "85226c05-ab07-4243-89af-c58b7c3d1f43" (UID: "85226c05-ab07-4243-89af-c58b7c3d1f43"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.372910 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85226c05-ab07-4243-89af-c58b7c3d1f43-config" (OuterVolumeSpecName: "config") pod "85226c05-ab07-4243-89af-c58b7c3d1f43" (UID: "85226c05-ab07-4243-89af-c58b7c3d1f43"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.374022 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85226c05-ab07-4243-89af-c58b7c3d1f43-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "85226c05-ab07-4243-89af-c58b7c3d1f43" (UID: "85226c05-ab07-4243-89af-c58b7c3d1f43"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.374707 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85226c05-ab07-4243-89af-c58b7c3d1f43-kube-api-access-f5h2s" (OuterVolumeSpecName: "kube-api-access-f5h2s") pod "85226c05-ab07-4243-89af-c58b7c3d1f43" (UID: "85226c05-ab07-4243-89af-c58b7c3d1f43"). InnerVolumeSpecName "kube-api-access-f5h2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.375869 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85226c05-ab07-4243-89af-c58b7c3d1f43-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "85226c05-ab07-4243-89af-c58b7c3d1f43" (UID: "85226c05-ab07-4243-89af-c58b7c3d1f43"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.419072 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-js2xl" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.422090 4906 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.445486 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-whmcg"] Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.470078 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.470349 4906 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/85226c05-ab07-4243-89af-c58b7c3d1f43-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.470364 4906 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/85226c05-ab07-4243-89af-c58b7c3d1f43-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.470374 4906 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85226c05-ab07-4243-89af-c58b7c3d1f43-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.470386 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5h2s\" (UniqueName: \"kubernetes.io/projected/85226c05-ab07-4243-89af-c58b7c3d1f43-kube-api-access-f5h2s\") on node \"crc\" DevicePath \"\"" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 
00:09:18.470400 4906 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85226c05-ab07-4243-89af-c58b7c3d1f43-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:09:18 crc kubenswrapper[4906]: E0310 00:09:18.470472 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 00:09:18.970452463 +0000 UTC m=+185.118347575 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.536144 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5bn7b"] Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.554419 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nwlr8"] Mar 10 00:09:18 crc kubenswrapper[4906]: E0310 00:09:18.569021 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85226c05-ab07-4243-89af-c58b7c3d1f43" containerName="controller-manager" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.569055 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="85226c05-ab07-4243-89af-c58b7c3d1f43" containerName="controller-manager" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.569073 4906 reconciler.go:161] "OperationExecutor.RegisterPlugin started" 
plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-10T00:09:18.422126027Z","Handler":null,"Name":""} Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.569169 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="85226c05-ab07-4243-89af-c58b7c3d1f43" containerName="controller-manager" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.571045 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nwlr8" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.572664 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:18 crc kubenswrapper[4906]: E0310 00:09:18.573165 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 00:09:19.073153347 +0000 UTC m=+185.221048459 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66jxp" (UID: "47ee6fa1-0ef0-414f-91af-0f170e94c390") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.577569 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.577900 4906 patch_prober.go:28] interesting pod/router-default-5444994796-kpmwl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 00:09:18 crc kubenswrapper[4906]: [-]has-synced failed: reason withheld Mar 10 00:09:18 crc kubenswrapper[4906]: [+]process-running ok Mar 10 00:09:18 crc kubenswrapper[4906]: healthz check failed Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.577930 4906 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kpmwl" podUID="6bbf8e83-d583-49de-a12c-6f0a3953dc67" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.582442 4906 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.582492 4906 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 10 00:09:18 crc 
kubenswrapper[4906]: I0310 00:09:18.629908 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebf8d4f5-6995-45dd-be49-491ced904443" path="/var/lib/kubelet/pods/ebf8d4f5-6995-45dd-be49-491ced904443/volumes" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.631310 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nwlr8"] Mar 10 00:09:18 crc kubenswrapper[4906]: W0310 00:09:18.647846 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55d944ce_605e_41a7_9211_a5bc388145f1.slice/crio-598957f4afb1692fee861808789efccb86dce707e7eab23564cb8718d3e43976 WatchSource:0}: Error finding container 598957f4afb1692fee861808789efccb86dce707e7eab23564cb8718d3e43976: Status 404 returned error can't find the container with id 598957f4afb1692fee861808789efccb86dce707e7eab23564cb8718d3e43976 Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.677304 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.677571 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68c8b80a-0af0-46cb-8a57-a353444de9dc-utilities\") pod \"redhat-operators-nwlr8\" (UID: \"68c8b80a-0af0-46cb-8a57-a353444de9dc\") " pod="openshift-marketplace/redhat-operators-nwlr8" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.677621 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/68c8b80a-0af0-46cb-8a57-a353444de9dc-catalog-content\") pod \"redhat-operators-nwlr8\" (UID: \"68c8b80a-0af0-46cb-8a57-a353444de9dc\") " pod="openshift-marketplace/redhat-operators-nwlr8" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.677682 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x59lh\" (UniqueName: \"kubernetes.io/projected/68c8b80a-0af0-46cb-8a57-a353444de9dc-kube-api-access-x59lh\") pod \"redhat-operators-nwlr8\" (UID: \"68c8b80a-0af0-46cb-8a57-a353444de9dc\") " pod="openshift-marketplace/redhat-operators-nwlr8" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.684290 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.732298 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f5fcb6c78-2jlb2"] Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.782594 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68c8b80a-0af0-46cb-8a57-a353444de9dc-catalog-content\") pod \"redhat-operators-nwlr8\" (UID: \"68c8b80a-0af0-46cb-8a57-a353444de9dc\") " pod="openshift-marketplace/redhat-operators-nwlr8" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.782729 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x59lh\" (UniqueName: \"kubernetes.io/projected/68c8b80a-0af0-46cb-8a57-a353444de9dc-kube-api-access-x59lh\") pod \"redhat-operators-nwlr8\" (UID: \"68c8b80a-0af0-46cb-8a57-a353444de9dc\") " pod="openshift-marketplace/redhat-operators-nwlr8" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.782767 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.782876 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68c8b80a-0af0-46cb-8a57-a353444de9dc-utilities\") pod \"redhat-operators-nwlr8\" (UID: \"68c8b80a-0af0-46cb-8a57-a353444de9dc\") " pod="openshift-marketplace/redhat-operators-nwlr8" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.783573 4906 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68c8b80a-0af0-46cb-8a57-a353444de9dc-utilities\") pod \"redhat-operators-nwlr8\" (UID: \"68c8b80a-0af0-46cb-8a57-a353444de9dc\") " pod="openshift-marketplace/redhat-operators-nwlr8" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.783614 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68c8b80a-0af0-46cb-8a57-a353444de9dc-catalog-content\") pod \"redhat-operators-nwlr8\" (UID: \"68c8b80a-0af0-46cb-8a57-a353444de9dc\") " pod="openshift-marketplace/redhat-operators-nwlr8" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.786841 4906 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.786899 4906 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.805060 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x59lh\" (UniqueName: \"kubernetes.io/projected/68c8b80a-0af0-46cb-8a57-a353444de9dc-kube-api-access-x59lh\") pod \"redhat-operators-nwlr8\" (UID: \"68c8b80a-0af0-46cb-8a57-a353444de9dc\") " pod="openshift-marketplace/redhat-operators-nwlr8" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.813310 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-js2xl"] Mar 10 00:09:18 crc kubenswrapper[4906]: W0310 00:09:18.854893 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a68ed18_11e8_4943_854e_a8e4a5566313.slice/crio-85b6d97638460e38f7f947eb9e3112739f9710544346fb5ae9fe79d346b0c8f3 WatchSource:0}: Error finding container 85b6d97638460e38f7f947eb9e3112739f9710544346fb5ae9fe79d346b0c8f3: Status 404 returned error can't find the container with id 85b6d97638460e38f7f947eb9e3112739f9710544346fb5ae9fe79d346b0c8f3 Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.867769 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66jxp\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.926488 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.932726 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.937369 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.938157 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.940272 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.966032 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nwlr8" Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.973350 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-m5t8r"] Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.979033 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m5t8r"] Mar 10 00:09:18 crc kubenswrapper[4906]: I0310 00:09:18.979231 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m5t8r" Mar 10 00:09:19 crc kubenswrapper[4906]: I0310 00:09:19.012499 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 10 00:09:19 crc kubenswrapper[4906]: I0310 00:09:19.019345 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:19 crc kubenswrapper[4906]: I0310 00:09:19.033306 4906 ???:1] "http: TLS handshake error from 192.168.126.11:39676: no serving certificate available for the kubelet" Mar 10 00:09:19 crc kubenswrapper[4906]: I0310 00:09:19.091236 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/947e7159-64b1-413f-8cee-daea0a8d0f3e-catalog-content\") pod \"redhat-operators-m5t8r\" (UID: \"947e7159-64b1-413f-8cee-daea0a8d0f3e\") " pod="openshift-marketplace/redhat-operators-m5t8r" Mar 10 00:09:19 crc kubenswrapper[4906]: I0310 00:09:19.091643 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/947e7159-64b1-413f-8cee-daea0a8d0f3e-utilities\") pod \"redhat-operators-m5t8r\" (UID: \"947e7159-64b1-413f-8cee-daea0a8d0f3e\") " pod="openshift-marketplace/redhat-operators-m5t8r" Mar 10 00:09:19 crc kubenswrapper[4906]: I0310 00:09:19.091680 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eae092ed-769c-484d-b71c-cac11de3899c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"eae092ed-769c-484d-b71c-cac11de3899c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 00:09:19 crc kubenswrapper[4906]: I0310 00:09:19.091698 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eae092ed-769c-484d-b71c-cac11de3899c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"eae092ed-769c-484d-b71c-cac11de3899c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 00:09:19 crc kubenswrapper[4906]: I0310 00:09:19.091724 4906 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm68j\" (UniqueName: \"kubernetes.io/projected/947e7159-64b1-413f-8cee-daea0a8d0f3e-kube-api-access-pm68j\") pod \"redhat-operators-m5t8r\" (UID: \"947e7159-64b1-413f-8cee-daea0a8d0f3e\") " pod="openshift-marketplace/redhat-operators-m5t8r" Mar 10 00:09:19 crc kubenswrapper[4906]: I0310 00:09:19.192855 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/947e7159-64b1-413f-8cee-daea0a8d0f3e-utilities\") pod \"redhat-operators-m5t8r\" (UID: \"947e7159-64b1-413f-8cee-daea0a8d0f3e\") " pod="openshift-marketplace/redhat-operators-m5t8r" Mar 10 00:09:19 crc kubenswrapper[4906]: I0310 00:09:19.192925 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eae092ed-769c-484d-b71c-cac11de3899c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"eae092ed-769c-484d-b71c-cac11de3899c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 00:09:19 crc kubenswrapper[4906]: I0310 00:09:19.192948 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eae092ed-769c-484d-b71c-cac11de3899c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"eae092ed-769c-484d-b71c-cac11de3899c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 00:09:19 crc kubenswrapper[4906]: I0310 00:09:19.192980 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pm68j\" (UniqueName: \"kubernetes.io/projected/947e7159-64b1-413f-8cee-daea0a8d0f3e-kube-api-access-pm68j\") pod \"redhat-operators-m5t8r\" (UID: \"947e7159-64b1-413f-8cee-daea0a8d0f3e\") " pod="openshift-marketplace/redhat-operators-m5t8r" Mar 10 00:09:19 crc kubenswrapper[4906]: I0310 00:09:19.193042 4906 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/947e7159-64b1-413f-8cee-daea0a8d0f3e-catalog-content\") pod \"redhat-operators-m5t8r\" (UID: \"947e7159-64b1-413f-8cee-daea0a8d0f3e\") " pod="openshift-marketplace/redhat-operators-m5t8r" Mar 10 00:09:19 crc kubenswrapper[4906]: I0310 00:09:19.193523 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/947e7159-64b1-413f-8cee-daea0a8d0f3e-utilities\") pod \"redhat-operators-m5t8r\" (UID: \"947e7159-64b1-413f-8cee-daea0a8d0f3e\") " pod="openshift-marketplace/redhat-operators-m5t8r" Mar 10 00:09:19 crc kubenswrapper[4906]: I0310 00:09:19.193567 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/947e7159-64b1-413f-8cee-daea0a8d0f3e-catalog-content\") pod \"redhat-operators-m5t8r\" (UID: \"947e7159-64b1-413f-8cee-daea0a8d0f3e\") " pod="openshift-marketplace/redhat-operators-m5t8r" Mar 10 00:09:19 crc kubenswrapper[4906]: I0310 00:09:19.194001 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eae092ed-769c-484d-b71c-cac11de3899c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"eae092ed-769c-484d-b71c-cac11de3899c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 00:09:19 crc kubenswrapper[4906]: I0310 00:09:19.218806 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eae092ed-769c-484d-b71c-cac11de3899c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"eae092ed-769c-484d-b71c-cac11de3899c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 00:09:19 crc kubenswrapper[4906]: I0310 00:09:19.223226 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-pm68j\" (UniqueName: \"kubernetes.io/projected/947e7159-64b1-413f-8cee-daea0a8d0f3e-kube-api-access-pm68j\") pod \"redhat-operators-m5t8r\" (UID: \"947e7159-64b1-413f-8cee-daea0a8d0f3e\") " pod="openshift-marketplace/redhat-operators-m5t8r" Mar 10 00:09:19 crc kubenswrapper[4906]: I0310 00:09:19.239829 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nwlr8"] Mar 10 00:09:19 crc kubenswrapper[4906]: I0310 00:09:19.241964 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-x6g9x" event={"ID":"d49db0cb-de7e-4194-9f6e-8b58cf5f98fb","Type":"ContainerStarted","Data":"ab5f637f21339e31d23ab29cd57de710bd7986c32d09e61f53e45eae0b263ef4"} Mar 10 00:09:19 crc kubenswrapper[4906]: I0310 00:09:19.242002 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-x6g9x" event={"ID":"d49db0cb-de7e-4194-9f6e-8b58cf5f98fb","Type":"ContainerStarted","Data":"8dfb87f92d622b4cc5787f22dbe851a265e1cdcd637a5963ec9213f417776201"} Mar 10 00:09:19 crc kubenswrapper[4906]: I0310 00:09:19.249009 4906 generic.go:334] "Generic (PLEG): container finished" podID="0492b9b7-f88e-48ee-88e6-83aa55d8ce65" containerID="64597f622ff2f5792f882b1527eab2d6f3437f35784c901b087a19a1798ee983" exitCode=0 Mar 10 00:09:19 crc kubenswrapper[4906]: I0310 00:09:19.249063 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551680-7lx2x" event={"ID":"0492b9b7-f88e-48ee-88e6-83aa55d8ce65","Type":"ContainerDied","Data":"64597f622ff2f5792f882b1527eab2d6f3437f35784c901b087a19a1798ee983"} Mar 10 00:09:19 crc kubenswrapper[4906]: I0310 00:09:19.252693 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5bn7b" event={"ID":"55d944ce-605e-41a7-9211-a5bc388145f1","Type":"ContainerStarted","Data":"10fa4f40d4a7a470b62bb8d34627c28b77993482d24ca04f103fc94b2118adad"} Mar 10 
00:09:19 crc kubenswrapper[4906]: I0310 00:09:19.252766 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5bn7b" event={"ID":"55d944ce-605e-41a7-9211-a5bc388145f1","Type":"ContainerStarted","Data":"598957f4afb1692fee861808789efccb86dce707e7eab23564cb8718d3e43976"} Mar 10 00:09:19 crc kubenswrapper[4906]: I0310 00:09:19.260002 4906 generic.go:334] "Generic (PLEG): container finished" podID="f1beb2f4-c1c5-488d-8c76-bed30174a0de" containerID="459c36aaee677af4ae9fca1a1132c151871bf6706b661310394ba474cb5bb9af" exitCode=0 Mar 10 00:09:19 crc kubenswrapper[4906]: I0310 00:09:19.260114 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-whmcg" event={"ID":"f1beb2f4-c1c5-488d-8c76-bed30174a0de","Type":"ContainerDied","Data":"459c36aaee677af4ae9fca1a1132c151871bf6706b661310394ba474cb5bb9af"} Mar 10 00:09:19 crc kubenswrapper[4906]: I0310 00:09:19.260161 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-whmcg" event={"ID":"f1beb2f4-c1c5-488d-8c76-bed30174a0de","Type":"ContainerStarted","Data":"ffe09a6914c2f34259169dbc5c27f70c5da0f786e780406e5c05938c5a78da4e"} Mar 10 00:09:19 crc kubenswrapper[4906]: I0310 00:09:19.266024 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-v25gg" Mar 10 00:09:19 crc kubenswrapper[4906]: I0310 00:09:19.266004 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-v25gg" event={"ID":"85226c05-ab07-4243-89af-c58b7c3d1f43","Type":"ContainerDied","Data":"8842ce2138db810d473bfd654d346fe7eefa4bd83570543528fc8359ff74fe0a"} Mar 10 00:09:19 crc kubenswrapper[4906]: I0310 00:09:19.266723 4906 scope.go:117] "RemoveContainer" containerID="a18807da7208beee65e915c4c8e80b88ace1d81d39610d4f7625ca5af530b46f" Mar 10 00:09:19 crc kubenswrapper[4906]: I0310 00:09:19.274200 4906 generic.go:334] "Generic (PLEG): container finished" podID="dc4d2e8f-54ca-464b-b186-432747b22864" containerID="eb3bfe73e4d142b168b2c628fb012ce231de3bc959b2e3e6f29d940caf292556" exitCode=0 Mar 10 00:09:19 crc kubenswrapper[4906]: I0310 00:09:19.274247 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nvvhx" event={"ID":"dc4d2e8f-54ca-464b-b186-432747b22864","Type":"ContainerDied","Data":"eb3bfe73e4d142b168b2c628fb012ce231de3bc959b2e3e6f29d940caf292556"} Mar 10 00:09:19 crc kubenswrapper[4906]: I0310 00:09:19.279519 4906 generic.go:334] "Generic (PLEG): container finished" podID="bfd0c098-c58b-456c-a9b2-270e749bc274" containerID="0b72a984e6ae9d267a0f0503a195eafcc0e1971c0a2345fe1abd9448f7f97767" exitCode=0 Mar 10 00:09:19 crc kubenswrapper[4906]: I0310 00:09:19.279688 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jbwl2" event={"ID":"bfd0c098-c58b-456c-a9b2-270e749bc274","Type":"ContainerDied","Data":"0b72a984e6ae9d267a0f0503a195eafcc0e1971c0a2345fe1abd9448f7f97767"} Mar 10 00:09:19 crc kubenswrapper[4906]: I0310 00:09:19.290415 4906 generic.go:334] "Generic (PLEG): container finished" podID="9a68ed18-11e8-4943-854e-a8e4a5566313" 
containerID="da54625627bec5526c7f69c77098a77afc9ad6db23f31a857a973002e5e69107" exitCode=0 Mar 10 00:09:19 crc kubenswrapper[4906]: I0310 00:09:19.290511 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-js2xl" event={"ID":"9a68ed18-11e8-4943-854e-a8e4a5566313","Type":"ContainerDied","Data":"da54625627bec5526c7f69c77098a77afc9ad6db23f31a857a973002e5e69107"} Mar 10 00:09:19 crc kubenswrapper[4906]: I0310 00:09:19.290545 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-js2xl" event={"ID":"9a68ed18-11e8-4943-854e-a8e4a5566313","Type":"ContainerStarted","Data":"85b6d97638460e38f7f947eb9e3112739f9710544346fb5ae9fe79d346b0c8f3"} Mar 10 00:09:19 crc kubenswrapper[4906]: I0310 00:09:19.291361 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 00:09:19 crc kubenswrapper[4906]: I0310 00:09:19.296378 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f5fcb6c78-2jlb2" event={"ID":"752d09a7-ead6-439b-a35c-6abc6a00afdb","Type":"ContainerStarted","Data":"00a11ce27fa2651dc095569a4f50523c2669f4f81a75b2278c676328403f6e12"} Mar 10 00:09:19 crc kubenswrapper[4906]: I0310 00:09:19.296467 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f5fcb6c78-2jlb2" event={"ID":"752d09a7-ead6-439b-a35c-6abc6a00afdb","Type":"ContainerStarted","Data":"64c7919f6a6849c42bc4401949ecd8339502b26370596b9301311543272cbd0c"} Mar 10 00:09:19 crc kubenswrapper[4906]: I0310 00:09:19.296943 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-f5fcb6c78-2jlb2" Mar 10 00:09:19 crc kubenswrapper[4906]: I0310 00:09:19.322840 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-879f6c89f-v25gg"] Mar 10 00:09:19 crc kubenswrapper[4906]: I0310 00:09:19.323191 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m5t8r" Mar 10 00:09:19 crc kubenswrapper[4906]: I0310 00:09:19.327135 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-v25gg"] Mar 10 00:09:19 crc kubenswrapper[4906]: W0310 00:09:19.342140 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68c8b80a_0af0_46cb_8a57_a353444de9dc.slice/crio-d0c6d459d953c52e76f1e79e20d47afe31f745624dd8022ad25146bd1500b707 WatchSource:0}: Error finding container d0c6d459d953c52e76f1e79e20d47afe31f745624dd8022ad25146bd1500b707: Status 404 returned error can't find the container with id d0c6d459d953c52e76f1e79e20d47afe31f745624dd8022ad25146bd1500b707 Mar 10 00:09:19 crc kubenswrapper[4906]: I0310 00:09:19.365558 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-f5fcb6c78-2jlb2" Mar 10 00:09:19 crc kubenswrapper[4906]: I0310 00:09:19.378145 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-f5fcb6c78-2jlb2" podStartSLOduration=3.378128177 podStartE2EDuration="3.378128177s" podCreationTimestamp="2026-03-10 00:09:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:19.376385558 +0000 UTC m=+185.524280670" watchObservedRunningTime="2026-03-10 00:09:19.378128177 +0000 UTC m=+185.526023289" Mar 10 00:09:19 crc kubenswrapper[4906]: I0310 00:09:19.579408 4906 patch_prober.go:28] interesting pod/router-default-5444994796-kpmwl container/router namespace/openshift-ingress: 
Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 00:09:19 crc kubenswrapper[4906]: [-]has-synced failed: reason withheld Mar 10 00:09:19 crc kubenswrapper[4906]: [+]process-running ok Mar 10 00:09:19 crc kubenswrapper[4906]: healthz check failed Mar 10 00:09:19 crc kubenswrapper[4906]: I0310 00:09:19.580005 4906 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kpmwl" podUID="6bbf8e83-d583-49de-a12c-6f0a3953dc67" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 00:09:19 crc kubenswrapper[4906]: I0310 00:09:19.592438 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-66jxp"] Mar 10 00:09:19 crc kubenswrapper[4906]: I0310 00:09:19.801265 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 10 00:09:19 crc kubenswrapper[4906]: I0310 00:09:19.833932 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m5t8r"] Mar 10 00:09:19 crc kubenswrapper[4906]: I0310 00:09:19.907555 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-t5bxf" Mar 10 00:09:19 crc kubenswrapper[4906]: I0310 00:09:19.907612 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-t5bxf" Mar 10 00:09:19 crc kubenswrapper[4906]: I0310 00:09:19.917411 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-t5bxf" Mar 10 00:09:19 crc kubenswrapper[4906]: W0310 00:09:19.943453 4906 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod947e7159_64b1_413f_8cee_daea0a8d0f3e.slice/crio-11741743aaf42428c752a9681edeb285f380fce0e15722827ea4e2a47cc76a2f WatchSource:0}: Error finding container 11741743aaf42428c752a9681edeb285f380fce0e15722827ea4e2a47cc76a2f: Status 404 returned error can't find the container with id 11741743aaf42428c752a9681edeb285f380fce0e15722827ea4e2a47cc76a2f Mar 10 00:09:19 crc kubenswrapper[4906]: I0310 00:09:19.991987 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jws8x" Mar 10 00:09:19 crc kubenswrapper[4906]: I0310 00:09:19.992028 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jws8x" Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.037319 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jws8x" Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.114990 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-74876fd9-cdhtz"] Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.116901 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-74876fd9-cdhtz" Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.123369 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.123686 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.123870 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.125525 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.127751 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.128189 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.134281 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.138986 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-74876fd9-cdhtz"] Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.192160 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-q9zx6" Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.192219 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-q9zx6" Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 
00:09:20.205384 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.207399 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.217271 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.217862 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.221974 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.223277 4906 patch_prober.go:28] interesting pod/console-f9d7485db-q9zx6 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.24:8443/health\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.225421 4906 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-q9zx6" podUID="39ebc592-086a-43e6-87d6-c2d67607d511" containerName="console" probeResult="failure" output="Get \"https://10.217.0.24:8443/health\": dial tcp 10.217.0.24:8443: connect: connection refused" Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.227706 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e695b953-e51d-40d2-aed6-732acb8abff2-proxy-ca-bundles\") pod \"controller-manager-74876fd9-cdhtz\" (UID: \"e695b953-e51d-40d2-aed6-732acb8abff2\") " pod="openshift-controller-manager/controller-manager-74876fd9-cdhtz" Mar 10 00:09:20 
crc kubenswrapper[4906]: I0310 00:09:20.227867 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e695b953-e51d-40d2-aed6-732acb8abff2-config\") pod \"controller-manager-74876fd9-cdhtz\" (UID: \"e695b953-e51d-40d2-aed6-732acb8abff2\") " pod="openshift-controller-manager/controller-manager-74876fd9-cdhtz" Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.227972 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74zhk\" (UniqueName: \"kubernetes.io/projected/e695b953-e51d-40d2-aed6-732acb8abff2-kube-api-access-74zhk\") pod \"controller-manager-74876fd9-cdhtz\" (UID: \"e695b953-e51d-40d2-aed6-732acb8abff2\") " pod="openshift-controller-manager/controller-manager-74876fd9-cdhtz" Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.228087 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e695b953-e51d-40d2-aed6-732acb8abff2-serving-cert\") pod \"controller-manager-74876fd9-cdhtz\" (UID: \"e695b953-e51d-40d2-aed6-732acb8abff2\") " pod="openshift-controller-manager/controller-manager-74876fd9-cdhtz" Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.228121 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e695b953-e51d-40d2-aed6-732acb8abff2-client-ca\") pod \"controller-manager-74876fd9-cdhtz\" (UID: \"e695b953-e51d-40d2-aed6-732acb8abff2\") " pod="openshift-controller-manager/controller-manager-74876fd9-cdhtz" Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.311904 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" 
event={"ID":"47ee6fa1-0ef0-414f-91af-0f170e94c390","Type":"ContainerStarted","Data":"10aee4e6e619472c94b77cc63290d24e555fb483641c7dcd4c04a02eb2112dd6"} Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.311957 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" event={"ID":"47ee6fa1-0ef0-414f-91af-0f170e94c390","Type":"ContainerStarted","Data":"f3e975c0539acc30f2d16ea72bcdba40ba957693aae0b95d8dab10c83f2fbdaa"} Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.312066 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.314099 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"eae092ed-769c-484d-b71c-cac11de3899c","Type":"ContainerStarted","Data":"68f9c5d6710a892fd202649d54d5a290fd59824759c382ca1bce7bd21d0658a9"} Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.316756 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m5t8r" event={"ID":"947e7159-64b1-413f-8cee-daea0a8d0f3e","Type":"ContainerStarted","Data":"11741743aaf42428c752a9681edeb285f380fce0e15722827ea4e2a47cc76a2f"} Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.321102 4906 generic.go:334] "Generic (PLEG): container finished" podID="68c8b80a-0af0-46cb-8a57-a353444de9dc" containerID="4c75d0ad3e643818b0755e7d1c3f1580b0dcb16b9627e9b8cf07dd1af7d05a29" exitCode=0 Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.322043 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nwlr8" event={"ID":"68c8b80a-0af0-46cb-8a57-a353444de9dc","Type":"ContainerDied","Data":"4c75d0ad3e643818b0755e7d1c3f1580b0dcb16b9627e9b8cf07dd1af7d05a29"} Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.322072 4906 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nwlr8" event={"ID":"68c8b80a-0af0-46cb-8a57-a353444de9dc","Type":"ContainerStarted","Data":"d0c6d459d953c52e76f1e79e20d47afe31f745624dd8022ad25146bd1500b707"} Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.329136 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e695b953-e51d-40d2-aed6-732acb8abff2-proxy-ca-bundles\") pod \"controller-manager-74876fd9-cdhtz\" (UID: \"e695b953-e51d-40d2-aed6-732acb8abff2\") " pod="openshift-controller-manager/controller-manager-74876fd9-cdhtz" Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.329186 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e695b953-e51d-40d2-aed6-732acb8abff2-config\") pod \"controller-manager-74876fd9-cdhtz\" (UID: \"e695b953-e51d-40d2-aed6-732acb8abff2\") " pod="openshift-controller-manager/controller-manager-74876fd9-cdhtz" Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.329214 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/33df4e47-60ec-4325-8d32-d9d3a66534da-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"33df4e47-60ec-4325-8d32-d9d3a66534da\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.329238 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/33df4e47-60ec-4325-8d32-d9d3a66534da-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"33df4e47-60ec-4325-8d32-d9d3a66534da\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.329351 4906 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-74zhk\" (UniqueName: \"kubernetes.io/projected/e695b953-e51d-40d2-aed6-732acb8abff2-kube-api-access-74zhk\") pod \"controller-manager-74876fd9-cdhtz\" (UID: \"e695b953-e51d-40d2-aed6-732acb8abff2\") " pod="openshift-controller-manager/controller-manager-74876fd9-cdhtz" Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.329569 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e695b953-e51d-40d2-aed6-732acb8abff2-serving-cert\") pod \"controller-manager-74876fd9-cdhtz\" (UID: \"e695b953-e51d-40d2-aed6-732acb8abff2\") " pod="openshift-controller-manager/controller-manager-74876fd9-cdhtz" Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.329669 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e695b953-e51d-40d2-aed6-732acb8abff2-client-ca\") pod \"controller-manager-74876fd9-cdhtz\" (UID: \"e695b953-e51d-40d2-aed6-732acb8abff2\") " pod="openshift-controller-manager/controller-manager-74876fd9-cdhtz" Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.330628 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e695b953-e51d-40d2-aed6-732acb8abff2-proxy-ca-bundles\") pod \"controller-manager-74876fd9-cdhtz\" (UID: \"e695b953-e51d-40d2-aed6-732acb8abff2\") " pod="openshift-controller-manager/controller-manager-74876fd9-cdhtz" Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.332046 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e695b953-e51d-40d2-aed6-732acb8abff2-client-ca\") pod \"controller-manager-74876fd9-cdhtz\" (UID: \"e695b953-e51d-40d2-aed6-732acb8abff2\") " pod="openshift-controller-manager/controller-manager-74876fd9-cdhtz" Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 
00:09:20.334611 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-x6g9x" event={"ID":"d49db0cb-de7e-4194-9f6e-8b58cf5f98fb","Type":"ContainerStarted","Data":"c46f5de62ac7ef5db9e38908a9aa71519e575fdc3acb156837f386ba2b8aa464"} Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.335087 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e695b953-e51d-40d2-aed6-732acb8abff2-config\") pod \"controller-manager-74876fd9-cdhtz\" (UID: \"e695b953-e51d-40d2-aed6-732acb8abff2\") " pod="openshift-controller-manager/controller-manager-74876fd9-cdhtz" Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.339004 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" podStartSLOduration=131.338975274 podStartE2EDuration="2m11.338975274s" podCreationTimestamp="2026-03-10 00:07:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:20.338887952 +0000 UTC m=+186.486783064" watchObservedRunningTime="2026-03-10 00:09:20.338975274 +0000 UTC m=+186.486870386" Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.352051 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e695b953-e51d-40d2-aed6-732acb8abff2-serving-cert\") pod \"controller-manager-74876fd9-cdhtz\" (UID: \"e695b953-e51d-40d2-aed6-732acb8abff2\") " pod="openshift-controller-manager/controller-manager-74876fd9-cdhtz" Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.352089 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74zhk\" (UniqueName: \"kubernetes.io/projected/e695b953-e51d-40d2-aed6-732acb8abff2-kube-api-access-74zhk\") pod \"controller-manager-74876fd9-cdhtz\" (UID: 
\"e695b953-e51d-40d2-aed6-732acb8abff2\") " pod="openshift-controller-manager/controller-manager-74876fd9-cdhtz" Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.368626 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5bn7b" event={"ID":"55d944ce-605e-41a7-9211-a5bc388145f1","Type":"ContainerStarted","Data":"608bdf3670afe8c4a0367eb6bfc23ae3690d63c243d948aca64cdc5403580ae5"} Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.373948 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-t5bxf" Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.375603 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jws8x" Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.378524 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-x6g9x" podStartSLOduration=13.378513222 podStartE2EDuration="13.378513222s" podCreationTimestamp="2026-03-10 00:09:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:20.37774505 +0000 UTC m=+186.525640162" watchObservedRunningTime="2026-03-10 00:09:20.378513222 +0000 UTC m=+186.526408334" Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.431570 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/33df4e47-60ec-4325-8d32-d9d3a66534da-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"33df4e47-60ec-4325-8d32-d9d3a66534da\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.431663 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/33df4e47-60ec-4325-8d32-d9d3a66534da-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"33df4e47-60ec-4325-8d32-d9d3a66534da\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.436031 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/33df4e47-60ec-4325-8d32-d9d3a66534da-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"33df4e47-60ec-4325-8d32-d9d3a66534da\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.451971 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-5bn7b" podStartSLOduration=132.451949278 podStartE2EDuration="2m12.451949278s" podCreationTimestamp="2026-03-10 00:07:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:20.449222461 +0000 UTC m=+186.597117573" watchObservedRunningTime="2026-03-10 00:09:20.451949278 +0000 UTC m=+186.599844390" Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.484062 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/33df4e47-60ec-4325-8d32-d9d3a66534da-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"33df4e47-60ec-4325-8d32-d9d3a66534da\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.504510 4906 patch_prober.go:28] interesting pod/downloads-7954f5f757-zkqrr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.504574 4906 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zkqrr" podUID="1c418134-7dc2-42f5-b1ce-c22a2b77a4b9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.504701 4906 patch_prober.go:28] interesting pod/downloads-7954f5f757-zkqrr container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.504788 4906 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-zkqrr" podUID="1c418134-7dc2-42f5-b1ce-c22a2b77a4b9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.538392 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-74876fd9-cdhtz" Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.559237 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.571496 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-kpmwl" Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.596426 4906 patch_prober.go:28] interesting pod/router-default-5444994796-kpmwl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 00:09:20 crc kubenswrapper[4906]: [-]has-synced failed: reason withheld Mar 10 00:09:20 crc kubenswrapper[4906]: [+]process-running ok Mar 10 00:09:20 crc kubenswrapper[4906]: healthz check failed Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.596536 4906 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kpmwl" podUID="6bbf8e83-d583-49de-a12c-6f0a3953dc67" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.641064 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85226c05-ab07-4243-89af-c58b7c3d1f43" path="/var/lib/kubelet/pods/85226c05-ab07-4243-89af-c58b7c3d1f43/volumes" Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.642291 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 10 00:09:20 crc kubenswrapper[4906]: I0310 00:09:20.964015 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-74876fd9-cdhtz"] Mar 10 00:09:21 crc kubenswrapper[4906]: W0310 00:09:21.008467 4906 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode695b953_e51d_40d2_aed6_732acb8abff2.slice/crio-104e7d53a7daf5c0197aee44e1300024e3eb200c9c3dec8621b604844d6ccce0 WatchSource:0}: Error finding container 104e7d53a7daf5c0197aee44e1300024e3eb200c9c3dec8621b604844d6ccce0: Status 404 returned error can't find the container with id 104e7d53a7daf5c0197aee44e1300024e3eb200c9c3dec8621b604844d6ccce0 Mar 10 00:09:21 crc kubenswrapper[4906]: I0310 00:09:21.227244 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-vw9rg" Mar 10 00:09:21 crc kubenswrapper[4906]: I0310 00:09:21.227686 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551680-7lx2x" Mar 10 00:09:21 crc kubenswrapper[4906]: I0310 00:09:21.331493 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 10 00:09:21 crc kubenswrapper[4906]: I0310 00:09:21.358679 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-647km\" (UniqueName: \"kubernetes.io/projected/0492b9b7-f88e-48ee-88e6-83aa55d8ce65-kube-api-access-647km\") pod \"0492b9b7-f88e-48ee-88e6-83aa55d8ce65\" (UID: \"0492b9b7-f88e-48ee-88e6-83aa55d8ce65\") " Mar 10 00:09:21 crc kubenswrapper[4906]: I0310 00:09:21.358759 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0492b9b7-f88e-48ee-88e6-83aa55d8ce65-config-volume\") pod \"0492b9b7-f88e-48ee-88e6-83aa55d8ce65\" (UID: \"0492b9b7-f88e-48ee-88e6-83aa55d8ce65\") " Mar 10 00:09:21 crc kubenswrapper[4906]: I0310 00:09:21.358799 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0492b9b7-f88e-48ee-88e6-83aa55d8ce65-secret-volume\") pod 
\"0492b9b7-f88e-48ee-88e6-83aa55d8ce65\" (UID: \"0492b9b7-f88e-48ee-88e6-83aa55d8ce65\") " Mar 10 00:09:21 crc kubenswrapper[4906]: I0310 00:09:21.367373 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0492b9b7-f88e-48ee-88e6-83aa55d8ce65-config-volume" (OuterVolumeSpecName: "config-volume") pod "0492b9b7-f88e-48ee-88e6-83aa55d8ce65" (UID: "0492b9b7-f88e-48ee-88e6-83aa55d8ce65"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:09:21 crc kubenswrapper[4906]: I0310 00:09:21.368468 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0492b9b7-f88e-48ee-88e6-83aa55d8ce65-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0492b9b7-f88e-48ee-88e6-83aa55d8ce65" (UID: "0492b9b7-f88e-48ee-88e6-83aa55d8ce65"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:09:21 crc kubenswrapper[4906]: I0310 00:09:21.402928 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0492b9b7-f88e-48ee-88e6-83aa55d8ce65-kube-api-access-647km" (OuterVolumeSpecName: "kube-api-access-647km") pod "0492b9b7-f88e-48ee-88e6-83aa55d8ce65" (UID: "0492b9b7-f88e-48ee-88e6-83aa55d8ce65"). InnerVolumeSpecName "kube-api-access-647km". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:09:21 crc kubenswrapper[4906]: I0310 00:09:21.451041 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74876fd9-cdhtz" event={"ID":"e695b953-e51d-40d2-aed6-732acb8abff2","Type":"ContainerStarted","Data":"104e7d53a7daf5c0197aee44e1300024e3eb200c9c3dec8621b604844d6ccce0"} Mar 10 00:09:21 crc kubenswrapper[4906]: I0310 00:09:21.460766 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-647km\" (UniqueName: \"kubernetes.io/projected/0492b9b7-f88e-48ee-88e6-83aa55d8ce65-kube-api-access-647km\") on node \"crc\" DevicePath \"\"" Mar 10 00:09:21 crc kubenswrapper[4906]: I0310 00:09:21.460804 4906 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0492b9b7-f88e-48ee-88e6-83aa55d8ce65-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 00:09:21 crc kubenswrapper[4906]: I0310 00:09:21.460814 4906 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0492b9b7-f88e-48ee-88e6-83aa55d8ce65-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 00:09:21 crc kubenswrapper[4906]: I0310 00:09:21.467004 4906 generic.go:334] "Generic (PLEG): container finished" podID="eae092ed-769c-484d-b71c-cac11de3899c" containerID="c0b32df1c1d2b0c68c814ce8c3aeb4b4e76050bdf31021d9c1266406e5fc8db9" exitCode=0 Mar 10 00:09:21 crc kubenswrapper[4906]: I0310 00:09:21.467545 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"eae092ed-769c-484d-b71c-cac11de3899c","Type":"ContainerDied","Data":"c0b32df1c1d2b0c68c814ce8c3aeb4b4e76050bdf31021d9c1266406e5fc8db9"} Mar 10 00:09:21 crc kubenswrapper[4906]: I0310 00:09:21.474924 4906 generic.go:334] "Generic (PLEG): container finished" podID="947e7159-64b1-413f-8cee-daea0a8d0f3e" 
containerID="e861630965e8a71e53ab2e340a22c2329aaa7f93a29f8c2ff77e420b64a9f9fc" exitCode=0 Mar 10 00:09:21 crc kubenswrapper[4906]: I0310 00:09:21.475059 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m5t8r" event={"ID":"947e7159-64b1-413f-8cee-daea0a8d0f3e","Type":"ContainerDied","Data":"e861630965e8a71e53ab2e340a22c2329aaa7f93a29f8c2ff77e420b64a9f9fc"} Mar 10 00:09:21 crc kubenswrapper[4906]: I0310 00:09:21.524950 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551680-7lx2x" event={"ID":"0492b9b7-f88e-48ee-88e6-83aa55d8ce65","Type":"ContainerDied","Data":"482af40c76464fdb90b58eb971231d8b8032366a2a8f9f9fdc35f6db012db761"} Mar 10 00:09:21 crc kubenswrapper[4906]: I0310 00:09:21.525003 4906 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="482af40c76464fdb90b58eb971231d8b8032366a2a8f9f9fdc35f6db012db761" Mar 10 00:09:21 crc kubenswrapper[4906]: I0310 00:09:21.525394 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551680-7lx2x" Mar 10 00:09:21 crc kubenswrapper[4906]: I0310 00:09:21.575064 4906 patch_prober.go:28] interesting pod/router-default-5444994796-kpmwl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 00:09:21 crc kubenswrapper[4906]: [-]has-synced failed: reason withheld Mar 10 00:09:21 crc kubenswrapper[4906]: [+]process-running ok Mar 10 00:09:21 crc kubenswrapper[4906]: healthz check failed Mar 10 00:09:21 crc kubenswrapper[4906]: I0310 00:09:21.575963 4906 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kpmwl" podUID="6bbf8e83-d583-49de-a12c-6f0a3953dc67" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 00:09:21 crc kubenswrapper[4906]: I0310 00:09:21.620150 4906 ???:1] "http: TLS handshake error from 192.168.126.11:55962: no serving certificate available for the kubelet" Mar 10 00:09:22 crc kubenswrapper[4906]: I0310 00:09:22.544752 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"33df4e47-60ec-4325-8d32-d9d3a66534da","Type":"ContainerStarted","Data":"89df8f665e6383dd129724217463ee770cad52bf113ff747e1aefb485fbabf1e"} Mar 10 00:09:22 crc kubenswrapper[4906]: I0310 00:09:22.545168 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"33df4e47-60ec-4325-8d32-d9d3a66534da","Type":"ContainerStarted","Data":"3fe26123e34b321301e1309d94a1bc601bffbeaef3e405b7449be8d794329236"} Mar 10 00:09:22 crc kubenswrapper[4906]: I0310 00:09:22.548959 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74876fd9-cdhtz" 
event={"ID":"e695b953-e51d-40d2-aed6-732acb8abff2","Type":"ContainerStarted","Data":"b979a1830fcf32d8889cb2d098d8f8254e8312e6d9cb5d1ed63177a65cb37ecc"} Mar 10 00:09:22 crc kubenswrapper[4906]: I0310 00:09:22.559714 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.559694352 podStartE2EDuration="2.559694352s" podCreationTimestamp="2026-03-10 00:09:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:22.559098706 +0000 UTC m=+188.706993828" watchObservedRunningTime="2026-03-10 00:09:22.559694352 +0000 UTC m=+188.707589464" Mar 10 00:09:22 crc kubenswrapper[4906]: I0310 00:09:22.561076 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-74876fd9-cdhtz" Mar 10 00:09:22 crc kubenswrapper[4906]: I0310 00:09:22.573986 4906 patch_prober.go:28] interesting pod/router-default-5444994796-kpmwl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 00:09:22 crc kubenswrapper[4906]: [-]has-synced failed: reason withheld Mar 10 00:09:22 crc kubenswrapper[4906]: [+]process-running ok Mar 10 00:09:22 crc kubenswrapper[4906]: healthz check failed Mar 10 00:09:22 crc kubenswrapper[4906]: I0310 00:09:22.574053 4906 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kpmwl" podUID="6bbf8e83-d583-49de-a12c-6f0a3953dc67" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 00:09:22 crc kubenswrapper[4906]: I0310 00:09:22.583985 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-74876fd9-cdhtz" podStartSLOduration=6.583965549 
podStartE2EDuration="6.583965549s" podCreationTimestamp="2026-03-10 00:09:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:22.58365536 +0000 UTC m=+188.731550472" watchObservedRunningTime="2026-03-10 00:09:22.583965549 +0000 UTC m=+188.731860661" Mar 10 00:09:22 crc kubenswrapper[4906]: I0310 00:09:22.604664 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-74876fd9-cdhtz" Mar 10 00:09:23 crc kubenswrapper[4906]: I0310 00:09:23.250099 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 00:09:23 crc kubenswrapper[4906]: I0310 00:09:23.310117 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eae092ed-769c-484d-b71c-cac11de3899c-kubelet-dir\") pod \"eae092ed-769c-484d-b71c-cac11de3899c\" (UID: \"eae092ed-769c-484d-b71c-cac11de3899c\") " Mar 10 00:09:23 crc kubenswrapper[4906]: I0310 00:09:23.310206 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eae092ed-769c-484d-b71c-cac11de3899c-kube-api-access\") pod \"eae092ed-769c-484d-b71c-cac11de3899c\" (UID: \"eae092ed-769c-484d-b71c-cac11de3899c\") " Mar 10 00:09:23 crc kubenswrapper[4906]: I0310 00:09:23.311470 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eae092ed-769c-484d-b71c-cac11de3899c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "eae092ed-769c-484d-b71c-cac11de3899c" (UID: "eae092ed-769c-484d-b71c-cac11de3899c"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:09:23 crc kubenswrapper[4906]: I0310 00:09:23.324731 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eae092ed-769c-484d-b71c-cac11de3899c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "eae092ed-769c-484d-b71c-cac11de3899c" (UID: "eae092ed-769c-484d-b71c-cac11de3899c"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:09:23 crc kubenswrapper[4906]: I0310 00:09:23.411477 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eae092ed-769c-484d-b71c-cac11de3899c-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 00:09:23 crc kubenswrapper[4906]: I0310 00:09:23.411512 4906 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eae092ed-769c-484d-b71c-cac11de3899c-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 10 00:09:23 crc kubenswrapper[4906]: I0310 00:09:23.568561 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"eae092ed-769c-484d-b71c-cac11de3899c","Type":"ContainerDied","Data":"68f9c5d6710a892fd202649d54d5a290fd59824759c382ca1bce7bd21d0658a9"} Mar 10 00:09:23 crc kubenswrapper[4906]: I0310 00:09:23.568608 4906 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68f9c5d6710a892fd202649d54d5a290fd59824759c382ca1bce7bd21d0658a9" Mar 10 00:09:23 crc kubenswrapper[4906]: I0310 00:09:23.568730 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 00:09:23 crc kubenswrapper[4906]: I0310 00:09:23.578078 4906 patch_prober.go:28] interesting pod/router-default-5444994796-kpmwl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 00:09:23 crc kubenswrapper[4906]: [-]has-synced failed: reason withheld Mar 10 00:09:23 crc kubenswrapper[4906]: [+]process-running ok Mar 10 00:09:23 crc kubenswrapper[4906]: healthz check failed Mar 10 00:09:23 crc kubenswrapper[4906]: I0310 00:09:23.578122 4906 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kpmwl" podUID="6bbf8e83-d583-49de-a12c-6f0a3953dc67" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 00:09:23 crc kubenswrapper[4906]: I0310 00:09:23.590712 4906 generic.go:334] "Generic (PLEG): container finished" podID="33df4e47-60ec-4325-8d32-d9d3a66534da" containerID="89df8f665e6383dd129724217463ee770cad52bf113ff747e1aefb485fbabf1e" exitCode=0 Mar 10 00:09:23 crc kubenswrapper[4906]: I0310 00:09:23.591824 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"33df4e47-60ec-4325-8d32-d9d3a66534da","Type":"ContainerDied","Data":"89df8f665e6383dd129724217463ee770cad52bf113ff747e1aefb485fbabf1e"} Mar 10 00:09:23 crc kubenswrapper[4906]: I0310 00:09:23.725703 4906 ???:1] "http: TLS handshake error from 192.168.126.11:55974: no serving certificate available for the kubelet" Mar 10 00:09:24 crc kubenswrapper[4906]: I0310 00:09:24.579503 4906 patch_prober.go:28] interesting pod/router-default-5444994796-kpmwl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 
00:09:24 crc kubenswrapper[4906]: [-]has-synced failed: reason withheld Mar 10 00:09:24 crc kubenswrapper[4906]: [+]process-running ok Mar 10 00:09:24 crc kubenswrapper[4906]: healthz check failed Mar 10 00:09:24 crc kubenswrapper[4906]: I0310 00:09:24.579562 4906 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kpmwl" podUID="6bbf8e83-d583-49de-a12c-6f0a3953dc67" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 00:09:24 crc kubenswrapper[4906]: I0310 00:09:24.902091 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 00:09:24 crc kubenswrapper[4906]: I0310 00:09:24.952976 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/33df4e47-60ec-4325-8d32-d9d3a66534da-kubelet-dir\") pod \"33df4e47-60ec-4325-8d32-d9d3a66534da\" (UID: \"33df4e47-60ec-4325-8d32-d9d3a66534da\") " Mar 10 00:09:24 crc kubenswrapper[4906]: I0310 00:09:24.953055 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/33df4e47-60ec-4325-8d32-d9d3a66534da-kube-api-access\") pod \"33df4e47-60ec-4325-8d32-d9d3a66534da\" (UID: \"33df4e47-60ec-4325-8d32-d9d3a66534da\") " Mar 10 00:09:24 crc kubenswrapper[4906]: I0310 00:09:24.953929 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33df4e47-60ec-4325-8d32-d9d3a66534da-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "33df4e47-60ec-4325-8d32-d9d3a66534da" (UID: "33df4e47-60ec-4325-8d32-d9d3a66534da"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:09:24 crc kubenswrapper[4906]: I0310 00:09:24.984425 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33df4e47-60ec-4325-8d32-d9d3a66534da-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "33df4e47-60ec-4325-8d32-d9d3a66534da" (UID: "33df4e47-60ec-4325-8d32-d9d3a66534da"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:09:25 crc kubenswrapper[4906]: I0310 00:09:25.054197 4906 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/33df4e47-60ec-4325-8d32-d9d3a66534da-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 10 00:09:25 crc kubenswrapper[4906]: I0310 00:09:25.054234 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/33df4e47-60ec-4325-8d32-d9d3a66534da-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 00:09:25 crc kubenswrapper[4906]: I0310 00:09:25.574738 4906 patch_prober.go:28] interesting pod/router-default-5444994796-kpmwl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 00:09:25 crc kubenswrapper[4906]: [-]has-synced failed: reason withheld Mar 10 00:09:25 crc kubenswrapper[4906]: [+]process-running ok Mar 10 00:09:25 crc kubenswrapper[4906]: healthz check failed Mar 10 00:09:25 crc kubenswrapper[4906]: I0310 00:09:25.574837 4906 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kpmwl" podUID="6bbf8e83-d583-49de-a12c-6f0a3953dc67" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 00:09:25 crc kubenswrapper[4906]: I0310 00:09:25.637655 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"33df4e47-60ec-4325-8d32-d9d3a66534da","Type":"ContainerDied","Data":"3fe26123e34b321301e1309d94a1bc601bffbeaef3e405b7449be8d794329236"} Mar 10 00:09:25 crc kubenswrapper[4906]: I0310 00:09:25.637703 4906 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3fe26123e34b321301e1309d94a1bc601bffbeaef3e405b7449be8d794329236" Mar 10 00:09:25 crc kubenswrapper[4906]: I0310 00:09:25.637770 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 00:09:26 crc kubenswrapper[4906]: I0310 00:09:26.361258 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-j4vpr" Mar 10 00:09:26 crc kubenswrapper[4906]: I0310 00:09:26.576229 4906 patch_prober.go:28] interesting pod/router-default-5444994796-kpmwl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 00:09:26 crc kubenswrapper[4906]: [-]has-synced failed: reason withheld Mar 10 00:09:26 crc kubenswrapper[4906]: [+]process-running ok Mar 10 00:09:26 crc kubenswrapper[4906]: healthz check failed Mar 10 00:09:26 crc kubenswrapper[4906]: I0310 00:09:26.576299 4906 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kpmwl" podUID="6bbf8e83-d583-49de-a12c-6f0a3953dc67" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 00:09:26 crc kubenswrapper[4906]: I0310 00:09:26.771067 4906 ???:1] "http: TLS handshake error from 192.168.126.11:55988: no serving certificate available for the kubelet" Mar 10 00:09:27 crc kubenswrapper[4906]: I0310 00:09:27.574271 4906 patch_prober.go:28] interesting pod/router-default-5444994796-kpmwl container/router namespace/openshift-ingress: Startup probe 
status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 00:09:27 crc kubenswrapper[4906]: [+]has-synced ok Mar 10 00:09:27 crc kubenswrapper[4906]: [+]process-running ok Mar 10 00:09:27 crc kubenswrapper[4906]: healthz check failed Mar 10 00:09:27 crc kubenswrapper[4906]: I0310 00:09:27.574356 4906 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kpmwl" podUID="6bbf8e83-d583-49de-a12c-6f0a3953dc67" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 00:09:28 crc kubenswrapper[4906]: I0310 00:09:28.587333 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-kpmwl" Mar 10 00:09:28 crc kubenswrapper[4906]: I0310 00:09:28.591418 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-kpmwl" Mar 10 00:09:30 crc kubenswrapper[4906]: I0310 00:09:30.226629 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-q9zx6" Mar 10 00:09:30 crc kubenswrapper[4906]: I0310 00:09:30.234924 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-q9zx6" Mar 10 00:09:30 crc kubenswrapper[4906]: I0310 00:09:30.514384 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-zkqrr" Mar 10 00:09:32 crc kubenswrapper[4906]: I0310 00:09:32.579049 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:09:32 crc kubenswrapper[4906]: I0310 00:09:32.579589 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:09:32 crc kubenswrapper[4906]: I0310 00:09:32.579623 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:09:32 crc kubenswrapper[4906]: I0310 00:09:32.579685 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:09:32 crc kubenswrapper[4906]: I0310 00:09:32.581608 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 10 00:09:32 crc kubenswrapper[4906]: I0310 00:09:32.581742 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 10 00:09:32 crc kubenswrapper[4906]: I0310 00:09:32.583776 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 10 00:09:32 crc kubenswrapper[4906]: I0310 00:09:32.591618 4906 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 10 00:09:32 crc kubenswrapper[4906]: I0310 00:09:32.597374 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:09:32 crc kubenswrapper[4906]: I0310 00:09:32.604739 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:09:32 crc kubenswrapper[4906]: I0310 00:09:32.606410 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:09:32 crc kubenswrapper[4906]: I0310 00:09:32.615908 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:09:32 crc kubenswrapper[4906]: I0310 00:09:32.721816 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 00:09:32 crc kubenswrapper[4906]: I0310 00:09:32.739907 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:09:32 crc kubenswrapper[4906]: I0310 00:09:32.750363 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 00:09:35 crc kubenswrapper[4906]: I0310 00:09:35.091816 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-74876fd9-cdhtz"] Mar 10 00:09:35 crc kubenswrapper[4906]: I0310 00:09:35.093371 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-74876fd9-cdhtz" podUID="e695b953-e51d-40d2-aed6-732acb8abff2" containerName="controller-manager" containerID="cri-o://b979a1830fcf32d8889cb2d098d8f8254e8312e6d9cb5d1ed63177a65cb37ecc" gracePeriod=30 Mar 10 00:09:35 crc kubenswrapper[4906]: I0310 00:09:35.097840 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f5fcb6c78-2jlb2"] Mar 10 00:09:35 crc kubenswrapper[4906]: I0310 00:09:35.098168 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-f5fcb6c78-2jlb2" podUID="752d09a7-ead6-439b-a35c-6abc6a00afdb" containerName="route-controller-manager" containerID="cri-o://00a11ce27fa2651dc095569a4f50523c2669f4f81a75b2278c676328403f6e12" gracePeriod=30 Mar 10 00:09:35 crc kubenswrapper[4906]: I0310 00:09:35.753985 4906 generic.go:334] "Generic (PLEG): container finished" podID="752d09a7-ead6-439b-a35c-6abc6a00afdb" containerID="00a11ce27fa2651dc095569a4f50523c2669f4f81a75b2278c676328403f6e12" exitCode=0 Mar 10 00:09:35 crc kubenswrapper[4906]: 
I0310 00:09:35.754076 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f5fcb6c78-2jlb2" event={"ID":"752d09a7-ead6-439b-a35c-6abc6a00afdb","Type":"ContainerDied","Data":"00a11ce27fa2651dc095569a4f50523c2669f4f81a75b2278c676328403f6e12"} Mar 10 00:09:35 crc kubenswrapper[4906]: I0310 00:09:35.755865 4906 generic.go:334] "Generic (PLEG): container finished" podID="e695b953-e51d-40d2-aed6-732acb8abff2" containerID="b979a1830fcf32d8889cb2d098d8f8254e8312e6d9cb5d1ed63177a65cb37ecc" exitCode=0 Mar 10 00:09:35 crc kubenswrapper[4906]: I0310 00:09:35.755895 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74876fd9-cdhtz" event={"ID":"e695b953-e51d-40d2-aed6-732acb8abff2","Type":"ContainerDied","Data":"b979a1830fcf32d8889cb2d098d8f8254e8312e6d9cb5d1ed63177a65cb37ecc"} Mar 10 00:09:38 crc kubenswrapper[4906]: I0310 00:09:38.343565 4906 patch_prober.go:28] interesting pod/route-controller-manager-f5fcb6c78-2jlb2 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" start-of-body= Mar 10 00:09:38 crc kubenswrapper[4906]: I0310 00:09:38.343653 4906 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-f5fcb6c78-2jlb2" podUID="752d09a7-ead6-439b-a35c-6abc6a00afdb" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" Mar 10 00:09:39 crc kubenswrapper[4906]: I0310 00:09:39.026066 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:09:40 crc kubenswrapper[4906]: I0310 00:09:40.540368 4906 patch_prober.go:28] interesting 
pod/controller-manager-74876fd9-cdhtz container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.56:8443/healthz\": dial tcp 10.217.0.56:8443: connect: connection refused" start-of-body= Mar 10 00:09:40 crc kubenswrapper[4906]: I0310 00:09:40.540440 4906 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-74876fd9-cdhtz" podUID="e695b953-e51d-40d2-aed6-732acb8abff2" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.56:8443/healthz\": dial tcp 10.217.0.56:8443: connect: connection refused" Mar 10 00:09:46 crc kubenswrapper[4906]: I0310 00:09:46.825145 4906 generic.go:334] "Generic (PLEG): container finished" podID="d91d282e-a4f3-4bc9-9623-4640774f641a" containerID="7e7178444a2023cfc4df700ef8b68a751212644659aac35229bd49430c1b1c77" exitCode=0 Mar 10 00:09:46 crc kubenswrapper[4906]: I0310 00:09:46.825229 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29551680-hclh4" event={"ID":"d91d282e-a4f3-4bc9-9623-4640774f641a","Type":"ContainerDied","Data":"7e7178444a2023cfc4df700ef8b68a751212644659aac35229bd49430c1b1c77"} Mar 10 00:09:47 crc kubenswrapper[4906]: E0310 00:09:47.140648 4906 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 10 00:09:47 crc kubenswrapper[4906]: E0310 00:09:47.141164 4906 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x59lh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-nwlr8_openshift-marketplace(68c8b80a-0af0-46cb-8a57-a353444de9dc): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 10 00:09:47 crc kubenswrapper[4906]: E0310 00:09:47.142315 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-nwlr8" podUID="68c8b80a-0af0-46cb-8a57-a353444de9dc" Mar 10 00:09:47 crc 
kubenswrapper[4906]: I0310 00:09:47.270218 4906 ???:1] "http: TLS handshake error from 192.168.126.11:52286: no serving certificate available for the kubelet" Mar 10 00:09:48 crc kubenswrapper[4906]: E0310 00:09:48.709610 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-nwlr8" podUID="68c8b80a-0af0-46cb-8a57-a353444de9dc" Mar 10 00:09:48 crc kubenswrapper[4906]: E0310 00:09:48.769428 4906 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 10 00:09:48 crc kubenswrapper[4906]: E0310 00:09:48.769609 4906 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gxrkm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-fhmhd_openshift-marketplace(52694cc4-226b-4bd5-a6c7-0ebd711926e2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 10 00:09:48 crc kubenswrapper[4906]: E0310 00:09:48.770831 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-fhmhd" podUID="52694cc4-226b-4bd5-a6c7-0ebd711926e2" Mar 10 00:09:49 crc 
kubenswrapper[4906]: I0310 00:09:49.348821 4906 patch_prober.go:28] interesting pod/route-controller-manager-f5fcb6c78-2jlb2 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.51:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 00:09:49 crc kubenswrapper[4906]: I0310 00:09:49.349390 4906 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-f5fcb6c78-2jlb2" podUID="752d09a7-ead6-439b-a35c-6abc6a00afdb" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.51:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 00:09:50 crc kubenswrapper[4906]: I0310 00:09:50.267298 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ldnm9" Mar 10 00:09:50 crc kubenswrapper[4906]: E0310 00:09:50.639119 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-fhmhd" podUID="52694cc4-226b-4bd5-a6c7-0ebd711926e2" Mar 10 00:09:50 crc kubenswrapper[4906]: E0310 00:09:50.715061 4906 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 10 00:09:50 crc kubenswrapper[4906]: E0310 00:09:50.715250 4906 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j4f97,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-nvvhx_openshift-marketplace(dc4d2e8f-54ca-464b-b186-432747b22864): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 10 00:09:50 crc kubenswrapper[4906]: E0310 00:09:50.716452 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: 
code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-nvvhx" podUID="dc4d2e8f-54ca-464b-b186-432747b22864" Mar 10 00:09:50 crc kubenswrapper[4906]: I0310 00:09:50.735271 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-74876fd9-cdhtz" Mar 10 00:09:50 crc kubenswrapper[4906]: I0310 00:09:50.748600 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f5fcb6c78-2jlb2" Mar 10 00:09:50 crc kubenswrapper[4906]: I0310 00:09:50.762987 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-664f87d5d8-7ktrx"] Mar 10 00:09:50 crc kubenswrapper[4906]: E0310 00:09:50.763380 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0492b9b7-f88e-48ee-88e6-83aa55d8ce65" containerName="collect-profiles" Mar 10 00:09:50 crc kubenswrapper[4906]: I0310 00:09:50.763396 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="0492b9b7-f88e-48ee-88e6-83aa55d8ce65" containerName="collect-profiles" Mar 10 00:09:50 crc kubenswrapper[4906]: E0310 00:09:50.763424 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eae092ed-769c-484d-b71c-cac11de3899c" containerName="pruner" Mar 10 00:09:50 crc kubenswrapper[4906]: I0310 00:09:50.763459 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="eae092ed-769c-484d-b71c-cac11de3899c" containerName="pruner" Mar 10 00:09:50 crc kubenswrapper[4906]: E0310 00:09:50.763483 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="752d09a7-ead6-439b-a35c-6abc6a00afdb" containerName="route-controller-manager" Mar 10 00:09:50 crc kubenswrapper[4906]: I0310 00:09:50.763489 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="752d09a7-ead6-439b-a35c-6abc6a00afdb" 
containerName="route-controller-manager" Mar 10 00:09:50 crc kubenswrapper[4906]: E0310 00:09:50.763502 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33df4e47-60ec-4325-8d32-d9d3a66534da" containerName="pruner" Mar 10 00:09:50 crc kubenswrapper[4906]: I0310 00:09:50.763550 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="33df4e47-60ec-4325-8d32-d9d3a66534da" containerName="pruner" Mar 10 00:09:50 crc kubenswrapper[4906]: E0310 00:09:50.763561 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e695b953-e51d-40d2-aed6-732acb8abff2" containerName="controller-manager" Mar 10 00:09:50 crc kubenswrapper[4906]: I0310 00:09:50.763647 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="e695b953-e51d-40d2-aed6-732acb8abff2" containerName="controller-manager" Mar 10 00:09:50 crc kubenswrapper[4906]: I0310 00:09:50.763905 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="e695b953-e51d-40d2-aed6-732acb8abff2" containerName="controller-manager" Mar 10 00:09:50 crc kubenswrapper[4906]: I0310 00:09:50.763927 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="eae092ed-769c-484d-b71c-cac11de3899c" containerName="pruner" Mar 10 00:09:50 crc kubenswrapper[4906]: I0310 00:09:50.763981 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="33df4e47-60ec-4325-8d32-d9d3a66534da" containerName="pruner" Mar 10 00:09:50 crc kubenswrapper[4906]: I0310 00:09:50.763990 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="0492b9b7-f88e-48ee-88e6-83aa55d8ce65" containerName="collect-profiles" Mar 10 00:09:50 crc kubenswrapper[4906]: I0310 00:09:50.764000 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="752d09a7-ead6-439b-a35c-6abc6a00afdb" containerName="route-controller-manager" Mar 10 00:09:50 crc kubenswrapper[4906]: I0310 00:09:50.765123 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-664f87d5d8-7ktrx" Mar 10 00:09:50 crc kubenswrapper[4906]: I0310 00:09:50.779177 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-664f87d5d8-7ktrx"] Mar 10 00:09:50 crc kubenswrapper[4906]: I0310 00:09:50.830104 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/752d09a7-ead6-439b-a35c-6abc6a00afdb-config\") pod \"752d09a7-ead6-439b-a35c-6abc6a00afdb\" (UID: \"752d09a7-ead6-439b-a35c-6abc6a00afdb\") " Mar 10 00:09:50 crc kubenswrapper[4906]: I0310 00:09:50.830191 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74zhk\" (UniqueName: \"kubernetes.io/projected/e695b953-e51d-40d2-aed6-732acb8abff2-kube-api-access-74zhk\") pod \"e695b953-e51d-40d2-aed6-732acb8abff2\" (UID: \"e695b953-e51d-40d2-aed6-732acb8abff2\") " Mar 10 00:09:50 crc kubenswrapper[4906]: I0310 00:09:50.830244 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/752d09a7-ead6-439b-a35c-6abc6a00afdb-client-ca\") pod \"752d09a7-ead6-439b-a35c-6abc6a00afdb\" (UID: \"752d09a7-ead6-439b-a35c-6abc6a00afdb\") " Mar 10 00:09:50 crc kubenswrapper[4906]: I0310 00:09:50.830284 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e695b953-e51d-40d2-aed6-732acb8abff2-serving-cert\") pod \"e695b953-e51d-40d2-aed6-732acb8abff2\" (UID: \"e695b953-e51d-40d2-aed6-732acb8abff2\") " Mar 10 00:09:50 crc kubenswrapper[4906]: I0310 00:09:50.830346 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/752d09a7-ead6-439b-a35c-6abc6a00afdb-serving-cert\") pod \"752d09a7-ead6-439b-a35c-6abc6a00afdb\" (UID: 
\"752d09a7-ead6-439b-a35c-6abc6a00afdb\") " Mar 10 00:09:50 crc kubenswrapper[4906]: I0310 00:09:50.830380 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e695b953-e51d-40d2-aed6-732acb8abff2-client-ca\") pod \"e695b953-e51d-40d2-aed6-732acb8abff2\" (UID: \"e695b953-e51d-40d2-aed6-732acb8abff2\") " Mar 10 00:09:50 crc kubenswrapper[4906]: I0310 00:09:50.830407 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnwwc\" (UniqueName: \"kubernetes.io/projected/752d09a7-ead6-439b-a35c-6abc6a00afdb-kube-api-access-lnwwc\") pod \"752d09a7-ead6-439b-a35c-6abc6a00afdb\" (UID: \"752d09a7-ead6-439b-a35c-6abc6a00afdb\") " Mar 10 00:09:50 crc kubenswrapper[4906]: I0310 00:09:50.830452 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e695b953-e51d-40d2-aed6-732acb8abff2-proxy-ca-bundles\") pod \"e695b953-e51d-40d2-aed6-732acb8abff2\" (UID: \"e695b953-e51d-40d2-aed6-732acb8abff2\") " Mar 10 00:09:50 crc kubenswrapper[4906]: I0310 00:09:50.830501 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e695b953-e51d-40d2-aed6-732acb8abff2-config\") pod \"e695b953-e51d-40d2-aed6-732acb8abff2\" (UID: \"e695b953-e51d-40d2-aed6-732acb8abff2\") " Mar 10 00:09:50 crc kubenswrapper[4906]: I0310 00:09:50.831528 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/752d09a7-ead6-439b-a35c-6abc6a00afdb-client-ca" (OuterVolumeSpecName: "client-ca") pod "752d09a7-ead6-439b-a35c-6abc6a00afdb" (UID: "752d09a7-ead6-439b-a35c-6abc6a00afdb"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:09:50 crc kubenswrapper[4906]: I0310 00:09:50.831690 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/752d09a7-ead6-439b-a35c-6abc6a00afdb-config" (OuterVolumeSpecName: "config") pod "752d09a7-ead6-439b-a35c-6abc6a00afdb" (UID: "752d09a7-ead6-439b-a35c-6abc6a00afdb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:09:50 crc kubenswrapper[4906]: I0310 00:09:50.832208 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e695b953-e51d-40d2-aed6-732acb8abff2-config" (OuterVolumeSpecName: "config") pod "e695b953-e51d-40d2-aed6-732acb8abff2" (UID: "e695b953-e51d-40d2-aed6-732acb8abff2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:09:50 crc kubenswrapper[4906]: I0310 00:09:50.832768 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e695b953-e51d-40d2-aed6-732acb8abff2-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e695b953-e51d-40d2-aed6-732acb8abff2" (UID: "e695b953-e51d-40d2-aed6-732acb8abff2"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:09:50 crc kubenswrapper[4906]: I0310 00:09:50.833224 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e695b953-e51d-40d2-aed6-732acb8abff2-client-ca" (OuterVolumeSpecName: "client-ca") pod "e695b953-e51d-40d2-aed6-732acb8abff2" (UID: "e695b953-e51d-40d2-aed6-732acb8abff2"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:09:50 crc kubenswrapper[4906]: I0310 00:09:50.841601 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e695b953-e51d-40d2-aed6-732acb8abff2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e695b953-e51d-40d2-aed6-732acb8abff2" (UID: "e695b953-e51d-40d2-aed6-732acb8abff2"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:09:50 crc kubenswrapper[4906]: I0310 00:09:50.842052 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/752d09a7-ead6-439b-a35c-6abc6a00afdb-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "752d09a7-ead6-439b-a35c-6abc6a00afdb" (UID: "752d09a7-ead6-439b-a35c-6abc6a00afdb"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:09:50 crc kubenswrapper[4906]: I0310 00:09:50.849695 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74876fd9-cdhtz" event={"ID":"e695b953-e51d-40d2-aed6-732acb8abff2","Type":"ContainerDied","Data":"104e7d53a7daf5c0197aee44e1300024e3eb200c9c3dec8621b604844d6ccce0"} Mar 10 00:09:50 crc kubenswrapper[4906]: I0310 00:09:50.849746 4906 scope.go:117] "RemoveContainer" containerID="b979a1830fcf32d8889cb2d098d8f8254e8312e6d9cb5d1ed63177a65cb37ecc" Mar 10 00:09:50 crc kubenswrapper[4906]: I0310 00:09:50.849914 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e695b953-e51d-40d2-aed6-732acb8abff2-kube-api-access-74zhk" (OuterVolumeSpecName: "kube-api-access-74zhk") pod "e695b953-e51d-40d2-aed6-732acb8abff2" (UID: "e695b953-e51d-40d2-aed6-732acb8abff2"). InnerVolumeSpecName "kube-api-access-74zhk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:09:50 crc kubenswrapper[4906]: I0310 00:09:50.849958 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-74876fd9-cdhtz" Mar 10 00:09:50 crc kubenswrapper[4906]: I0310 00:09:50.854761 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/752d09a7-ead6-439b-a35c-6abc6a00afdb-kube-api-access-lnwwc" (OuterVolumeSpecName: "kube-api-access-lnwwc") pod "752d09a7-ead6-439b-a35c-6abc6a00afdb" (UID: "752d09a7-ead6-439b-a35c-6abc6a00afdb"). InnerVolumeSpecName "kube-api-access-lnwwc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:09:50 crc kubenswrapper[4906]: I0310 00:09:50.856448 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f5fcb6c78-2jlb2" Mar 10 00:09:50 crc kubenswrapper[4906]: I0310 00:09:50.858137 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f5fcb6c78-2jlb2" event={"ID":"752d09a7-ead6-439b-a35c-6abc6a00afdb","Type":"ContainerDied","Data":"64c7919f6a6849c42bc4401949ecd8339502b26370596b9301311543272cbd0c"} Mar 10 00:09:50 crc kubenswrapper[4906]: I0310 00:09:50.894230 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-74876fd9-cdhtz"] Mar 10 00:09:50 crc kubenswrapper[4906]: I0310 00:09:50.898830 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-74876fd9-cdhtz"] Mar 10 00:09:50 crc kubenswrapper[4906]: I0310 00:09:50.908828 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f5fcb6c78-2jlb2"] Mar 10 00:09:50 crc kubenswrapper[4906]: I0310 00:09:50.912351 4906 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-route-controller-manager/route-controller-manager-f5fcb6c78-2jlb2"] Mar 10 00:09:50 crc kubenswrapper[4906]: I0310 00:09:50.932085 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e646eb29-cf82-4566-bdb2-6e65f915ad9b-config\") pod \"controller-manager-664f87d5d8-7ktrx\" (UID: \"e646eb29-cf82-4566-bdb2-6e65f915ad9b\") " pod="openshift-controller-manager/controller-manager-664f87d5d8-7ktrx" Mar 10 00:09:50 crc kubenswrapper[4906]: I0310 00:09:50.932163 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e646eb29-cf82-4566-bdb2-6e65f915ad9b-serving-cert\") pod \"controller-manager-664f87d5d8-7ktrx\" (UID: \"e646eb29-cf82-4566-bdb2-6e65f915ad9b\") " pod="openshift-controller-manager/controller-manager-664f87d5d8-7ktrx" Mar 10 00:09:50 crc kubenswrapper[4906]: I0310 00:09:50.932325 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e646eb29-cf82-4566-bdb2-6e65f915ad9b-proxy-ca-bundles\") pod \"controller-manager-664f87d5d8-7ktrx\" (UID: \"e646eb29-cf82-4566-bdb2-6e65f915ad9b\") " pod="openshift-controller-manager/controller-manager-664f87d5d8-7ktrx" Mar 10 00:09:50 crc kubenswrapper[4906]: I0310 00:09:50.932425 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e646eb29-cf82-4566-bdb2-6e65f915ad9b-client-ca\") pod \"controller-manager-664f87d5d8-7ktrx\" (UID: \"e646eb29-cf82-4566-bdb2-6e65f915ad9b\") " pod="openshift-controller-manager/controller-manager-664f87d5d8-7ktrx" Mar 10 00:09:50 crc kubenswrapper[4906]: I0310 00:09:50.932467 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-vnlnd\" (UniqueName: \"kubernetes.io/projected/e646eb29-cf82-4566-bdb2-6e65f915ad9b-kube-api-access-vnlnd\") pod \"controller-manager-664f87d5d8-7ktrx\" (UID: \"e646eb29-cf82-4566-bdb2-6e65f915ad9b\") " pod="openshift-controller-manager/controller-manager-664f87d5d8-7ktrx" Mar 10 00:09:50 crc kubenswrapper[4906]: I0310 00:09:50.932558 4906 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e695b953-e51d-40d2-aed6-732acb8abff2-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:09:50 crc kubenswrapper[4906]: I0310 00:09:50.932581 4906 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/752d09a7-ead6-439b-a35c-6abc6a00afdb-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:09:50 crc kubenswrapper[4906]: I0310 00:09:50.932598 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74zhk\" (UniqueName: \"kubernetes.io/projected/e695b953-e51d-40d2-aed6-732acb8abff2-kube-api-access-74zhk\") on node \"crc\" DevicePath \"\"" Mar 10 00:09:50 crc kubenswrapper[4906]: I0310 00:09:50.932610 4906 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/752d09a7-ead6-439b-a35c-6abc6a00afdb-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:09:50 crc kubenswrapper[4906]: I0310 00:09:50.932622 4906 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e695b953-e51d-40d2-aed6-732acb8abff2-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:09:50 crc kubenswrapper[4906]: I0310 00:09:50.932645 4906 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/752d09a7-ead6-439b-a35c-6abc6a00afdb-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:09:50 crc kubenswrapper[4906]: I0310 00:09:50.932675 4906 reconciler_common.go:293] "Volume detached for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/e695b953-e51d-40d2-aed6-732acb8abff2-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:09:50 crc kubenswrapper[4906]: I0310 00:09:50.932686 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnwwc\" (UniqueName: \"kubernetes.io/projected/752d09a7-ead6-439b-a35c-6abc6a00afdb-kube-api-access-lnwwc\") on node \"crc\" DevicePath \"\"" Mar 10 00:09:50 crc kubenswrapper[4906]: I0310 00:09:50.932697 4906 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e695b953-e51d-40d2-aed6-732acb8abff2-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:09:51 crc kubenswrapper[4906]: I0310 00:09:51.033339 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e646eb29-cf82-4566-bdb2-6e65f915ad9b-proxy-ca-bundles\") pod \"controller-manager-664f87d5d8-7ktrx\" (UID: \"e646eb29-cf82-4566-bdb2-6e65f915ad9b\") " pod="openshift-controller-manager/controller-manager-664f87d5d8-7ktrx" Mar 10 00:09:51 crc kubenswrapper[4906]: I0310 00:09:51.033412 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e646eb29-cf82-4566-bdb2-6e65f915ad9b-client-ca\") pod \"controller-manager-664f87d5d8-7ktrx\" (UID: \"e646eb29-cf82-4566-bdb2-6e65f915ad9b\") " pod="openshift-controller-manager/controller-manager-664f87d5d8-7ktrx" Mar 10 00:09:51 crc kubenswrapper[4906]: I0310 00:09:51.033441 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnlnd\" (UniqueName: \"kubernetes.io/projected/e646eb29-cf82-4566-bdb2-6e65f915ad9b-kube-api-access-vnlnd\") pod \"controller-manager-664f87d5d8-7ktrx\" (UID: \"e646eb29-cf82-4566-bdb2-6e65f915ad9b\") " pod="openshift-controller-manager/controller-manager-664f87d5d8-7ktrx" Mar 10 00:09:51 crc kubenswrapper[4906]: I0310 
00:09:51.033488 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e646eb29-cf82-4566-bdb2-6e65f915ad9b-config\") pod \"controller-manager-664f87d5d8-7ktrx\" (UID: \"e646eb29-cf82-4566-bdb2-6e65f915ad9b\") " pod="openshift-controller-manager/controller-manager-664f87d5d8-7ktrx" Mar 10 00:09:51 crc kubenswrapper[4906]: I0310 00:09:51.033528 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e646eb29-cf82-4566-bdb2-6e65f915ad9b-serving-cert\") pod \"controller-manager-664f87d5d8-7ktrx\" (UID: \"e646eb29-cf82-4566-bdb2-6e65f915ad9b\") " pod="openshift-controller-manager/controller-manager-664f87d5d8-7ktrx" Mar 10 00:09:51 crc kubenswrapper[4906]: I0310 00:09:51.034762 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e646eb29-cf82-4566-bdb2-6e65f915ad9b-proxy-ca-bundles\") pod \"controller-manager-664f87d5d8-7ktrx\" (UID: \"e646eb29-cf82-4566-bdb2-6e65f915ad9b\") " pod="openshift-controller-manager/controller-manager-664f87d5d8-7ktrx" Mar 10 00:09:51 crc kubenswrapper[4906]: I0310 00:09:51.035383 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e646eb29-cf82-4566-bdb2-6e65f915ad9b-config\") pod \"controller-manager-664f87d5d8-7ktrx\" (UID: \"e646eb29-cf82-4566-bdb2-6e65f915ad9b\") " pod="openshift-controller-manager/controller-manager-664f87d5d8-7ktrx" Mar 10 00:09:51 crc kubenswrapper[4906]: I0310 00:09:51.035397 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e646eb29-cf82-4566-bdb2-6e65f915ad9b-client-ca\") pod \"controller-manager-664f87d5d8-7ktrx\" (UID: \"e646eb29-cf82-4566-bdb2-6e65f915ad9b\") " pod="openshift-controller-manager/controller-manager-664f87d5d8-7ktrx" Mar 
10 00:09:51 crc kubenswrapper[4906]: I0310 00:09:51.037713 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e646eb29-cf82-4566-bdb2-6e65f915ad9b-serving-cert\") pod \"controller-manager-664f87d5d8-7ktrx\" (UID: \"e646eb29-cf82-4566-bdb2-6e65f915ad9b\") " pod="openshift-controller-manager/controller-manager-664f87d5d8-7ktrx" Mar 10 00:09:51 crc kubenswrapper[4906]: I0310 00:09:51.049388 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnlnd\" (UniqueName: \"kubernetes.io/projected/e646eb29-cf82-4566-bdb2-6e65f915ad9b-kube-api-access-vnlnd\") pod \"controller-manager-664f87d5d8-7ktrx\" (UID: \"e646eb29-cf82-4566-bdb2-6e65f915ad9b\") " pod="openshift-controller-manager/controller-manager-664f87d5d8-7ktrx" Mar 10 00:09:51 crc kubenswrapper[4906]: I0310 00:09:51.084404 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-664f87d5d8-7ktrx" Mar 10 00:09:51 crc kubenswrapper[4906]: I0310 00:09:51.540135 4906 patch_prober.go:28] interesting pod/controller-manager-74876fd9-cdhtz container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.56:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 00:09:51 crc kubenswrapper[4906]: I0310 00:09:51.540239 4906 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-74876fd9-cdhtz" podUID="e695b953-e51d-40d2-aed6-732acb8abff2" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.56:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 00:09:51 crc kubenswrapper[4906]: E0310 00:09:51.680542 4906 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-nvvhx" podUID="dc4d2e8f-54ca-464b-b186-432747b22864" Mar 10 00:09:51 crc kubenswrapper[4906]: E0310 00:09:51.715913 4906 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 10 00:09:51 crc kubenswrapper[4906]: E0310 00:09:51.716083 4906 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 00:09:51 crc kubenswrapper[4906]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 10 00:09:51 crc kubenswrapper[4906]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2nbzf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29551688-fkkqj_openshift-infra(094c6270-b610-42c0-a6ce-3c146cb6bb6c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 10 00:09:51 
crc kubenswrapper[4906]: > logger="UnhandledError" Mar 10 00:09:51 crc kubenswrapper[4906]: E0310 00:09:51.717427 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29551688-fkkqj" podUID="094c6270-b610-42c0-a6ce-3c146cb6bb6c" Mar 10 00:09:51 crc kubenswrapper[4906]: E0310 00:09:51.769584 4906 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 10 00:09:51 crc kubenswrapper[4906]: E0310 00:09:51.769803 4906 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d8576,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-jbwl2_openshift-marketplace(bfd0c098-c58b-456c-a9b2-270e749bc274): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 10 00:09:51 crc kubenswrapper[4906]: E0310 00:09:51.770983 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-jbwl2" podUID="bfd0c098-c58b-456c-a9b2-270e749bc274" Mar 10 00:09:51 crc 
kubenswrapper[4906]: E0310 00:09:51.864477 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29551688-fkkqj" podUID="094c6270-b610-42c0-a6ce-3c146cb6bb6c" Mar 10 00:09:52 crc kubenswrapper[4906]: I0310 00:09:52.583773 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="752d09a7-ead6-439b-a35c-6abc6a00afdb" path="/var/lib/kubelet/pods/752d09a7-ead6-439b-a35c-6abc6a00afdb/volumes" Mar 10 00:09:52 crc kubenswrapper[4906]: I0310 00:09:52.584292 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e695b953-e51d-40d2-aed6-732acb8abff2" path="/var/lib/kubelet/pods/e695b953-e51d-40d2-aed6-732acb8abff2/volumes" Mar 10 00:09:52 crc kubenswrapper[4906]: E0310 00:09:52.973390 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-jbwl2" podUID="bfd0c098-c58b-456c-a9b2-270e749bc274" Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.012083 4906 scope.go:117] "RemoveContainer" containerID="00a11ce27fa2651dc095569a4f50523c2669f4f81a75b2278c676328403f6e12" Mar 10 00:09:53 crc kubenswrapper[4906]: E0310 00:09:53.058097 4906 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 10 00:09:53 crc kubenswrapper[4906]: E0310 00:09:53.058251 4906 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w58s7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-js2xl_openshift-marketplace(9a68ed18-11e8-4943-854e-a8e4a5566313): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 10 00:09:53 crc kubenswrapper[4906]: E0310 00:09:53.059419 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code 
= Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-js2xl" podUID="9a68ed18-11e8-4943-854e-a8e4a5566313" Mar 10 00:09:53 crc kubenswrapper[4906]: E0310 00:09:53.102892 4906 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 10 00:09:53 crc kubenswrapper[4906]: E0310 00:09:53.103106 4906 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zvbd9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSourc
e{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-whmcg_openshift-marketplace(f1beb2f4-c1c5-488d-8c76-bed30174a0de): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 10 00:09:53 crc kubenswrapper[4906]: E0310 00:09:53.104321 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-whmcg" podUID="f1beb2f4-c1c5-488d-8c76-bed30174a0de" Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.136526 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bf76d6c95-wnfp6"] Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.137309 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6bf76d6c95-wnfp6" Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.141220 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.141426 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.142038 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.147196 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.147373 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.147497 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.155716 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29551680-hclh4" Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.162825 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bf76d6c95-wnfp6"] Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.198540 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 10 00:09:53 crc kubenswrapper[4906]: E0310 00:09:53.200129 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d91d282e-a4f3-4bc9-9623-4640774f641a" containerName="image-pruner" Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.200150 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="d91d282e-a4f3-4bc9-9623-4640774f641a" containerName="image-pruner" Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.200250 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="d91d282e-a4f3-4bc9-9623-4640774f641a" containerName="image-pruner" Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.200649 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.204311 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.208739 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.232351 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.274018 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gsbqs\" (UniqueName: \"kubernetes.io/projected/d91d282e-a4f3-4bc9-9623-4640774f641a-kube-api-access-gsbqs\") pod \"d91d282e-a4f3-4bc9-9623-4640774f641a\" (UID: \"d91d282e-a4f3-4bc9-9623-4640774f641a\") " Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.274239 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d91d282e-a4f3-4bc9-9623-4640774f641a-serviceca\") pod \"d91d282e-a4f3-4bc9-9623-4640774f641a\" (UID: \"d91d282e-a4f3-4bc9-9623-4640774f641a\") " Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.274498 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84f75957-59f7-4f21-990d-7581456a7c85-config\") pod \"route-controller-manager-6bf76d6c95-wnfp6\" (UID: \"84f75957-59f7-4f21-990d-7581456a7c85\") " pod="openshift-route-controller-manager/route-controller-manager-6bf76d6c95-wnfp6" Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.274570 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pw52\" (UniqueName: 
\"kubernetes.io/projected/84f75957-59f7-4f21-990d-7581456a7c85-kube-api-access-9pw52\") pod \"route-controller-manager-6bf76d6c95-wnfp6\" (UID: \"84f75957-59f7-4f21-990d-7581456a7c85\") " pod="openshift-route-controller-manager/route-controller-manager-6bf76d6c95-wnfp6" Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.274594 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ab9454f9-7589-4fb0-ae3a-319be58b20f2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ab9454f9-7589-4fb0-ae3a-319be58b20f2\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.274630 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84f75957-59f7-4f21-990d-7581456a7c85-serving-cert\") pod \"route-controller-manager-6bf76d6c95-wnfp6\" (UID: \"84f75957-59f7-4f21-990d-7581456a7c85\") " pod="openshift-route-controller-manager/route-controller-manager-6bf76d6c95-wnfp6" Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.274960 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ab9454f9-7589-4fb0-ae3a-319be58b20f2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ab9454f9-7589-4fb0-ae3a-319be58b20f2\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.275021 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d91d282e-a4f3-4bc9-9623-4640774f641a-serviceca" (OuterVolumeSpecName: "serviceca") pod "d91d282e-a4f3-4bc9-9623-4640774f641a" (UID: "d91d282e-a4f3-4bc9-9623-4640774f641a"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.275061 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/84f75957-59f7-4f21-990d-7581456a7c85-client-ca\") pod \"route-controller-manager-6bf76d6c95-wnfp6\" (UID: \"84f75957-59f7-4f21-990d-7581456a7c85\") " pod="openshift-route-controller-manager/route-controller-manager-6bf76d6c95-wnfp6" Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.275229 4906 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d91d282e-a4f3-4bc9-9623-4640774f641a-serviceca\") on node \"crc\" DevicePath \"\"" Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.282721 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d91d282e-a4f3-4bc9-9623-4640774f641a-kube-api-access-gsbqs" (OuterVolumeSpecName: "kube-api-access-gsbqs") pod "d91d282e-a4f3-4bc9-9623-4640774f641a" (UID: "d91d282e-a4f3-4bc9-9623-4640774f641a"). InnerVolumeSpecName "kube-api-access-gsbqs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.376482 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ab9454f9-7589-4fb0-ae3a-319be58b20f2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ab9454f9-7589-4fb0-ae3a-319be58b20f2\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.376549 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/84f75957-59f7-4f21-990d-7581456a7c85-client-ca\") pod \"route-controller-manager-6bf76d6c95-wnfp6\" (UID: \"84f75957-59f7-4f21-990d-7581456a7c85\") " pod="openshift-route-controller-manager/route-controller-manager-6bf76d6c95-wnfp6" Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.376597 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84f75957-59f7-4f21-990d-7581456a7c85-config\") pod \"route-controller-manager-6bf76d6c95-wnfp6\" (UID: \"84f75957-59f7-4f21-990d-7581456a7c85\") " pod="openshift-route-controller-manager/route-controller-manager-6bf76d6c95-wnfp6" Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.376671 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pw52\" (UniqueName: \"kubernetes.io/projected/84f75957-59f7-4f21-990d-7581456a7c85-kube-api-access-9pw52\") pod \"route-controller-manager-6bf76d6c95-wnfp6\" (UID: \"84f75957-59f7-4f21-990d-7581456a7c85\") " pod="openshift-route-controller-manager/route-controller-manager-6bf76d6c95-wnfp6" Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.376699 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/ab9454f9-7589-4fb0-ae3a-319be58b20f2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ab9454f9-7589-4fb0-ae3a-319be58b20f2\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.376726 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84f75957-59f7-4f21-990d-7581456a7c85-serving-cert\") pod \"route-controller-manager-6bf76d6c95-wnfp6\" (UID: \"84f75957-59f7-4f21-990d-7581456a7c85\") " pod="openshift-route-controller-manager/route-controller-manager-6bf76d6c95-wnfp6" Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.376786 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gsbqs\" (UniqueName: \"kubernetes.io/projected/d91d282e-a4f3-4bc9-9623-4640774f641a-kube-api-access-gsbqs\") on node \"crc\" DevicePath \"\"" Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.376841 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ab9454f9-7589-4fb0-ae3a-319be58b20f2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ab9454f9-7589-4fb0-ae3a-319be58b20f2\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.377529 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/84f75957-59f7-4f21-990d-7581456a7c85-client-ca\") pod \"route-controller-manager-6bf76d6c95-wnfp6\" (UID: \"84f75957-59f7-4f21-990d-7581456a7c85\") " pod="openshift-route-controller-manager/route-controller-manager-6bf76d6c95-wnfp6" Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.377706 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84f75957-59f7-4f21-990d-7581456a7c85-config\") pod 
\"route-controller-manager-6bf76d6c95-wnfp6\" (UID: \"84f75957-59f7-4f21-990d-7581456a7c85\") " pod="openshift-route-controller-manager/route-controller-manager-6bf76d6c95-wnfp6" Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.381433 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84f75957-59f7-4f21-990d-7581456a7c85-serving-cert\") pod \"route-controller-manager-6bf76d6c95-wnfp6\" (UID: \"84f75957-59f7-4f21-990d-7581456a7c85\") " pod="openshift-route-controller-manager/route-controller-manager-6bf76d6c95-wnfp6" Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.393651 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ab9454f9-7589-4fb0-ae3a-319be58b20f2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ab9454f9-7589-4fb0-ae3a-319be58b20f2\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.394034 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pw52\" (UniqueName: \"kubernetes.io/projected/84f75957-59f7-4f21-990d-7581456a7c85-kube-api-access-9pw52\") pod \"route-controller-manager-6bf76d6c95-wnfp6\" (UID: \"84f75957-59f7-4f21-990d-7581456a7c85\") " pod="openshift-route-controller-manager/route-controller-manager-6bf76d6c95-wnfp6" Mar 10 00:09:53 crc kubenswrapper[4906]: W0310 00:09:53.517649 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-3bb5270d7286a9ee34b8ba1111afaa12c0ce1b9342444e901830f2251df5380c WatchSource:0}: Error finding container 3bb5270d7286a9ee34b8ba1111afaa12c0ce1b9342444e901830f2251df5380c: Status 404 returned error can't find the container with id 3bb5270d7286a9ee34b8ba1111afaa12c0ce1b9342444e901830f2251df5380c Mar 10 00:09:53 crc 
kubenswrapper[4906]: I0310 00:09:53.544061 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6bf76d6c95-wnfp6" Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.559067 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.574741 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-664f87d5d8-7ktrx"] Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.876257 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bf76d6c95-wnfp6"] Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.881331 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"b4e70cd6b9a3e4efaa1a0b847f9cac41f0885215983950abc1cee61a183ac73d"} Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.881372 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"3548fe8214138ee82b335bcee6d5c6a5686f0ec2a4046c53262b73e8fc419d21"} Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.882806 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29551680-hclh4" event={"ID":"d91d282e-a4f3-4bc9-9623-4640774f641a","Type":"ContainerDied","Data":"8c633a50db1c44e8b86d17ea03a08cf6ee7642ae5e1d973cfc5f07ba0c58702b"} Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.882841 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29551680-hclh4" Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.882861 4906 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c633a50db1c44e8b86d17ea03a08cf6ee7642ae5e1d973cfc5f07ba0c58702b" Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.885297 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"966cab553059fe0245d7bb0df9f6ece7bb552318d68d2401733ceeacd8145aef"} Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.885344 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"3bb5270d7286a9ee34b8ba1111afaa12c0ce1b9342444e901830f2251df5380c"} Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.897307 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"6ff55cfcf1213c4f0f42842dfcafa34679b6eee280e834ec3217abebc5d32121"} Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.897370 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"4ef041f041a1fac5ae7847eb6c095398b0ab23ef1cd1fdaa4557d1cb4d7dbf21"} Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.907375 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-664f87d5d8-7ktrx" 
event={"ID":"e646eb29-cf82-4566-bdb2-6e65f915ad9b","Type":"ContainerStarted","Data":"b8af88492b2ba6afde680e1b2aeed48eb67b708944003fc872aa0fdef2e65345"} Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.907513 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-664f87d5d8-7ktrx" event={"ID":"e646eb29-cf82-4566-bdb2-6e65f915ad9b","Type":"ContainerStarted","Data":"66cd9f40d73a302112116cadb021b7bcd8e309bd9811c9976b9c665e632440ea"} Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.909688 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-664f87d5d8-7ktrx" Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.918104 4906 generic.go:334] "Generic (PLEG): container finished" podID="5331074a-1c86-455a-80e9-6f945936e218" containerID="db247e3434e7832999a110107e7ad0e11c7fa548c68d84f4ebdd7bdc54438a1d" exitCode=0 Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.920745 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t65x9" event={"ID":"5331074a-1c86-455a-80e9-6f945936e218","Type":"ContainerDied","Data":"db247e3434e7832999a110107e7ad0e11c7fa548c68d84f4ebdd7bdc54438a1d"} Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.922286 4906 patch_prober.go:28] interesting pod/controller-manager-664f87d5d8-7ktrx container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused" start-of-body= Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.927620 4906 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-664f87d5d8-7ktrx" podUID="e646eb29-cf82-4566-bdb2-6e65f915ad9b" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 
10.217.0.58:8443: connect: connection refused" Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.945445 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m5t8r" event={"ID":"947e7159-64b1-413f-8cee-daea0a8d0f3e","Type":"ContainerStarted","Data":"710e1b4602077bc734ff3424a05e8f4b902cf0266148e992a74274a460cd25fc"} Mar 10 00:09:53 crc kubenswrapper[4906]: E0310 00:09:53.947300 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-whmcg" podUID="f1beb2f4-c1c5-488d-8c76-bed30174a0de" Mar 10 00:09:53 crc kubenswrapper[4906]: E0310 00:09:53.952411 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-js2xl" podUID="9a68ed18-11e8-4943-854e-a8e4a5566313" Mar 10 00:09:53 crc kubenswrapper[4906]: I0310 00:09:53.985046 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-664f87d5d8-7ktrx" podStartSLOduration=18.98502316 podStartE2EDuration="18.98502316s" podCreationTimestamp="2026-03-10 00:09:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:53.965969871 +0000 UTC m=+220.113864983" watchObservedRunningTime="2026-03-10 00:09:53.98502316 +0000 UTC m=+220.132918272" Mar 10 00:09:54 crc kubenswrapper[4906]: I0310 00:09:54.150404 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 10 00:09:54 crc kubenswrapper[4906]: I0310 00:09:54.955156 4906 
generic.go:334] "Generic (PLEG): container finished" podID="947e7159-64b1-413f-8cee-daea0a8d0f3e" containerID="710e1b4602077bc734ff3424a05e8f4b902cf0266148e992a74274a460cd25fc" exitCode=0 Mar 10 00:09:54 crc kubenswrapper[4906]: I0310 00:09:54.955344 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m5t8r" event={"ID":"947e7159-64b1-413f-8cee-daea0a8d0f3e","Type":"ContainerDied","Data":"710e1b4602077bc734ff3424a05e8f4b902cf0266148e992a74274a460cd25fc"} Mar 10 00:09:54 crc kubenswrapper[4906]: I0310 00:09:54.957254 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6bf76d6c95-wnfp6" event={"ID":"84f75957-59f7-4f21-990d-7581456a7c85","Type":"ContainerStarted","Data":"a26895736aec56059cd973a427124eb0187fa3acb75f6c14dbb8a57436f18ea5"} Mar 10 00:09:54 crc kubenswrapper[4906]: I0310 00:09:54.957285 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6bf76d6c95-wnfp6" event={"ID":"84f75957-59f7-4f21-990d-7581456a7c85","Type":"ContainerStarted","Data":"8b87df0e2360784d66eab9a04e6539221098c6e58726fcfc4eda1402fef8a295"} Mar 10 00:09:54 crc kubenswrapper[4906]: I0310 00:09:54.958186 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6bf76d6c95-wnfp6" Mar 10 00:09:54 crc kubenswrapper[4906]: I0310 00:09:54.961097 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"ab9454f9-7589-4fb0-ae3a-319be58b20f2","Type":"ContainerStarted","Data":"939c4f63057e9048d0d277b1be0a61275e7a72d784613d0591b98df2174db126"} Mar 10 00:09:54 crc kubenswrapper[4906]: I0310 00:09:54.961175 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" 
event={"ID":"ab9454f9-7589-4fb0-ae3a-319be58b20f2","Type":"ContainerStarted","Data":"270232557bf82a75a1ef48b7aadbe9f729efa50c05a1ee7e3012a6a8767143bd"} Mar 10 00:09:54 crc kubenswrapper[4906]: I0310 00:09:54.961347 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:09:54 crc kubenswrapper[4906]: I0310 00:09:54.964049 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6bf76d6c95-wnfp6" Mar 10 00:09:54 crc kubenswrapper[4906]: I0310 00:09:54.965802 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-664f87d5d8-7ktrx" Mar 10 00:09:54 crc kubenswrapper[4906]: I0310 00:09:54.991536 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6bf76d6c95-wnfp6" podStartSLOduration=19.991517061 podStartE2EDuration="19.991517061s" podCreationTimestamp="2026-03-10 00:09:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:54.986965523 +0000 UTC m=+221.134860635" watchObservedRunningTime="2026-03-10 00:09:54.991517061 +0000 UTC m=+221.139412173" Mar 10 00:09:55 crc kubenswrapper[4906]: I0310 00:09:55.060331 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.060310866 podStartE2EDuration="2.060310866s" podCreationTimestamp="2026-03-10 00:09:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:09:55.056764025 +0000 UTC m=+221.204659137" watchObservedRunningTime="2026-03-10 00:09:55.060310866 +0000 UTC m=+221.208205978" Mar 10 00:09:55 crc kubenswrapper[4906]: 
I0310 00:09:55.170378 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-664f87d5d8-7ktrx"] Mar 10 00:09:55 crc kubenswrapper[4906]: I0310 00:09:55.260849 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bf76d6c95-wnfp6"] Mar 10 00:09:55 crc kubenswrapper[4906]: I0310 00:09:55.971944 4906 generic.go:334] "Generic (PLEG): container finished" podID="ab9454f9-7589-4fb0-ae3a-319be58b20f2" containerID="939c4f63057e9048d0d277b1be0a61275e7a72d784613d0591b98df2174db126" exitCode=0 Mar 10 00:09:55 crc kubenswrapper[4906]: I0310 00:09:55.972177 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"ab9454f9-7589-4fb0-ae3a-319be58b20f2","Type":"ContainerDied","Data":"939c4f63057e9048d0d277b1be0a61275e7a72d784613d0591b98df2174db126"} Mar 10 00:09:56 crc kubenswrapper[4906]: I0310 00:09:56.988994 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m5t8r" event={"ID":"947e7159-64b1-413f-8cee-daea0a8d0f3e","Type":"ContainerStarted","Data":"5e87f60c38a416b4361c3d348c8d7fd07655cfdd139585279fd93a787cdacb4c"} Mar 10 00:09:56 crc kubenswrapper[4906]: I0310 00:09:56.990582 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t65x9" event={"ID":"5331074a-1c86-455a-80e9-6f945936e218","Type":"ContainerStarted","Data":"4af373b75c87b9e13e9f5b3ce8a670161ab21c49d5861a970763f95a4475c871"} Mar 10 00:09:56 crc kubenswrapper[4906]: I0310 00:09:56.990766 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-664f87d5d8-7ktrx" podUID="e646eb29-cf82-4566-bdb2-6e65f915ad9b" containerName="controller-manager" containerID="cri-o://b8af88492b2ba6afde680e1b2aeed48eb67b708944003fc872aa0fdef2e65345" gracePeriod=30 Mar 10 00:09:56 crc 
kubenswrapper[4906]: I0310 00:09:56.990805 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6bf76d6c95-wnfp6" podUID="84f75957-59f7-4f21-990d-7581456a7c85" containerName="route-controller-manager" containerID="cri-o://a26895736aec56059cd973a427124eb0187fa3acb75f6c14dbb8a57436f18ea5" gracePeriod=30 Mar 10 00:09:57 crc kubenswrapper[4906]: I0310 00:09:57.021847 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-m5t8r" podStartSLOduration=4.571048945 podStartE2EDuration="39.021829203s" podCreationTimestamp="2026-03-10 00:09:18 +0000 UTC" firstStartedPulling="2026-03-10 00:09:21.486975162 +0000 UTC m=+187.634870274" lastFinishedPulling="2026-03-10 00:09:55.93775542 +0000 UTC m=+222.085650532" observedRunningTime="2026-03-10 00:09:57.020364412 +0000 UTC m=+223.168259544" watchObservedRunningTime="2026-03-10 00:09:57.021829203 +0000 UTC m=+223.169724315" Mar 10 00:09:57 crc kubenswrapper[4906]: I0310 00:09:57.040531 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-t65x9" podStartSLOduration=4.194411553 podStartE2EDuration="42.040515611s" podCreationTimestamp="2026-03-10 00:09:15 +0000 UTC" firstStartedPulling="2026-03-10 00:09:18.039875099 +0000 UTC m=+184.187770211" lastFinishedPulling="2026-03-10 00:09:55.885979167 +0000 UTC m=+222.033874269" observedRunningTime="2026-03-10 00:09:57.039716189 +0000 UTC m=+223.187611301" watchObservedRunningTime="2026-03-10 00:09:57.040515611 +0000 UTC m=+223.188410713" Mar 10 00:09:57 crc kubenswrapper[4906]: I0310 00:09:57.241587 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 00:09:57 crc kubenswrapper[4906]: I0310 00:09:57.381468 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ab9454f9-7589-4fb0-ae3a-319be58b20f2-kubelet-dir\") pod \"ab9454f9-7589-4fb0-ae3a-319be58b20f2\" (UID: \"ab9454f9-7589-4fb0-ae3a-319be58b20f2\") " Mar 10 00:09:57 crc kubenswrapper[4906]: I0310 00:09:57.381548 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ab9454f9-7589-4fb0-ae3a-319be58b20f2-kube-api-access\") pod \"ab9454f9-7589-4fb0-ae3a-319be58b20f2\" (UID: \"ab9454f9-7589-4fb0-ae3a-319be58b20f2\") " Mar 10 00:09:57 crc kubenswrapper[4906]: I0310 00:09:57.387963 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ab9454f9-7589-4fb0-ae3a-319be58b20f2-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ab9454f9-7589-4fb0-ae3a-319be58b20f2" (UID: "ab9454f9-7589-4fb0-ae3a-319be58b20f2"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:09:57 crc kubenswrapper[4906]: I0310 00:09:57.393386 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab9454f9-7589-4fb0-ae3a-319be58b20f2-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ab9454f9-7589-4fb0-ae3a-319be58b20f2" (UID: "ab9454f9-7589-4fb0-ae3a-319be58b20f2"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:09:57 crc kubenswrapper[4906]: I0310 00:09:57.482762 4906 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ab9454f9-7589-4fb0-ae3a-319be58b20f2-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 10 00:09:57 crc kubenswrapper[4906]: I0310 00:09:57.482797 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ab9454f9-7589-4fb0-ae3a-319be58b20f2-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 00:09:57 crc kubenswrapper[4906]: I0310 00:09:57.494939 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6bf76d6c95-wnfp6" Mar 10 00:09:57 crc kubenswrapper[4906]: I0310 00:09:57.498365 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-664f87d5d8-7ktrx" Mar 10 00:09:57 crc kubenswrapper[4906]: I0310 00:09:57.685262 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnlnd\" (UniqueName: \"kubernetes.io/projected/e646eb29-cf82-4566-bdb2-6e65f915ad9b-kube-api-access-vnlnd\") pod \"e646eb29-cf82-4566-bdb2-6e65f915ad9b\" (UID: \"e646eb29-cf82-4566-bdb2-6e65f915ad9b\") " Mar 10 00:09:57 crc kubenswrapper[4906]: I0310 00:09:57.685332 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pw52\" (UniqueName: \"kubernetes.io/projected/84f75957-59f7-4f21-990d-7581456a7c85-kube-api-access-9pw52\") pod \"84f75957-59f7-4f21-990d-7581456a7c85\" (UID: \"84f75957-59f7-4f21-990d-7581456a7c85\") " Mar 10 00:09:57 crc kubenswrapper[4906]: I0310 00:09:57.685422 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/84f75957-59f7-4f21-990d-7581456a7c85-serving-cert\") pod \"84f75957-59f7-4f21-990d-7581456a7c85\" (UID: \"84f75957-59f7-4f21-990d-7581456a7c85\") " Mar 10 00:09:57 crc kubenswrapper[4906]: I0310 00:09:57.685453 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/84f75957-59f7-4f21-990d-7581456a7c85-client-ca\") pod \"84f75957-59f7-4f21-990d-7581456a7c85\" (UID: \"84f75957-59f7-4f21-990d-7581456a7c85\") " Mar 10 00:09:57 crc kubenswrapper[4906]: I0310 00:09:57.685479 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84f75957-59f7-4f21-990d-7581456a7c85-config\") pod \"84f75957-59f7-4f21-990d-7581456a7c85\" (UID: \"84f75957-59f7-4f21-990d-7581456a7c85\") " Mar 10 00:09:57 crc kubenswrapper[4906]: I0310 00:09:57.685501 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e646eb29-cf82-4566-bdb2-6e65f915ad9b-serving-cert\") pod \"e646eb29-cf82-4566-bdb2-6e65f915ad9b\" (UID: \"e646eb29-cf82-4566-bdb2-6e65f915ad9b\") " Mar 10 00:09:57 crc kubenswrapper[4906]: I0310 00:09:57.685518 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e646eb29-cf82-4566-bdb2-6e65f915ad9b-proxy-ca-bundles\") pod \"e646eb29-cf82-4566-bdb2-6e65f915ad9b\" (UID: \"e646eb29-cf82-4566-bdb2-6e65f915ad9b\") " Mar 10 00:09:57 crc kubenswrapper[4906]: I0310 00:09:57.685535 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e646eb29-cf82-4566-bdb2-6e65f915ad9b-config\") pod \"e646eb29-cf82-4566-bdb2-6e65f915ad9b\" (UID: \"e646eb29-cf82-4566-bdb2-6e65f915ad9b\") " Mar 10 00:09:57 crc kubenswrapper[4906]: I0310 00:09:57.685567 4906 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e646eb29-cf82-4566-bdb2-6e65f915ad9b-client-ca\") pod \"e646eb29-cf82-4566-bdb2-6e65f915ad9b\" (UID: \"e646eb29-cf82-4566-bdb2-6e65f915ad9b\") " Mar 10 00:09:57 crc kubenswrapper[4906]: I0310 00:09:57.686614 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e646eb29-cf82-4566-bdb2-6e65f915ad9b-client-ca" (OuterVolumeSpecName: "client-ca") pod "e646eb29-cf82-4566-bdb2-6e65f915ad9b" (UID: "e646eb29-cf82-4566-bdb2-6e65f915ad9b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:09:57 crc kubenswrapper[4906]: I0310 00:09:57.686680 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e646eb29-cf82-4566-bdb2-6e65f915ad9b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e646eb29-cf82-4566-bdb2-6e65f915ad9b" (UID: "e646eb29-cf82-4566-bdb2-6e65f915ad9b"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:09:57 crc kubenswrapper[4906]: I0310 00:09:57.686711 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84f75957-59f7-4f21-990d-7581456a7c85-client-ca" (OuterVolumeSpecName: "client-ca") pod "84f75957-59f7-4f21-990d-7581456a7c85" (UID: "84f75957-59f7-4f21-990d-7581456a7c85"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:09:57 crc kubenswrapper[4906]: I0310 00:09:57.686895 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e646eb29-cf82-4566-bdb2-6e65f915ad9b-config" (OuterVolumeSpecName: "config") pod "e646eb29-cf82-4566-bdb2-6e65f915ad9b" (UID: "e646eb29-cf82-4566-bdb2-6e65f915ad9b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:09:57 crc kubenswrapper[4906]: I0310 00:09:57.687020 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84f75957-59f7-4f21-990d-7581456a7c85-config" (OuterVolumeSpecName: "config") pod "84f75957-59f7-4f21-990d-7581456a7c85" (UID: "84f75957-59f7-4f21-990d-7581456a7c85"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:09:57 crc kubenswrapper[4906]: I0310 00:09:57.690570 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e646eb29-cf82-4566-bdb2-6e65f915ad9b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e646eb29-cf82-4566-bdb2-6e65f915ad9b" (UID: "e646eb29-cf82-4566-bdb2-6e65f915ad9b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:09:57 crc kubenswrapper[4906]: I0310 00:09:57.690624 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84f75957-59f7-4f21-990d-7581456a7c85-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "84f75957-59f7-4f21-990d-7581456a7c85" (UID: "84f75957-59f7-4f21-990d-7581456a7c85"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:09:57 crc kubenswrapper[4906]: I0310 00:09:57.692744 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84f75957-59f7-4f21-990d-7581456a7c85-kube-api-access-9pw52" (OuterVolumeSpecName: "kube-api-access-9pw52") pod "84f75957-59f7-4f21-990d-7581456a7c85" (UID: "84f75957-59f7-4f21-990d-7581456a7c85"). InnerVolumeSpecName "kube-api-access-9pw52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:09:57 crc kubenswrapper[4906]: I0310 00:09:57.692996 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e646eb29-cf82-4566-bdb2-6e65f915ad9b-kube-api-access-vnlnd" (OuterVolumeSpecName: "kube-api-access-vnlnd") pod "e646eb29-cf82-4566-bdb2-6e65f915ad9b" (UID: "e646eb29-cf82-4566-bdb2-6e65f915ad9b"). InnerVolumeSpecName "kube-api-access-vnlnd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:09:57 crc kubenswrapper[4906]: I0310 00:09:57.787299 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnlnd\" (UniqueName: \"kubernetes.io/projected/e646eb29-cf82-4566-bdb2-6e65f915ad9b-kube-api-access-vnlnd\") on node \"crc\" DevicePath \"\"" Mar 10 00:09:57 crc kubenswrapper[4906]: I0310 00:09:57.787348 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pw52\" (UniqueName: \"kubernetes.io/projected/84f75957-59f7-4f21-990d-7581456a7c85-kube-api-access-9pw52\") on node \"crc\" DevicePath \"\"" Mar 10 00:09:57 crc kubenswrapper[4906]: I0310 00:09:57.787363 4906 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84f75957-59f7-4f21-990d-7581456a7c85-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:09:57 crc kubenswrapper[4906]: I0310 00:09:57.787376 4906 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/84f75957-59f7-4f21-990d-7581456a7c85-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:09:57 crc kubenswrapper[4906]: I0310 00:09:57.787386 4906 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84f75957-59f7-4f21-990d-7581456a7c85-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:09:57 crc kubenswrapper[4906]: I0310 00:09:57.787972 4906 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/e646eb29-cf82-4566-bdb2-6e65f915ad9b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:09:57 crc kubenswrapper[4906]: I0310 00:09:57.788025 4906 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e646eb29-cf82-4566-bdb2-6e65f915ad9b-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:09:57 crc kubenswrapper[4906]: I0310 00:09:57.788046 4906 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e646eb29-cf82-4566-bdb2-6e65f915ad9b-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:09:57 crc kubenswrapper[4906]: I0310 00:09:57.788061 4906 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e646eb29-cf82-4566-bdb2-6e65f915ad9b-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:09:57 crc kubenswrapper[4906]: I0310 00:09:57.998309 4906 generic.go:334] "Generic (PLEG): container finished" podID="e646eb29-cf82-4566-bdb2-6e65f915ad9b" containerID="b8af88492b2ba6afde680e1b2aeed48eb67b708944003fc872aa0fdef2e65345" exitCode=0 Mar 10 00:09:57 crc kubenswrapper[4906]: I0310 00:09:57.998384 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-664f87d5d8-7ktrx" event={"ID":"e646eb29-cf82-4566-bdb2-6e65f915ad9b","Type":"ContainerDied","Data":"b8af88492b2ba6afde680e1b2aeed48eb67b708944003fc872aa0fdef2e65345"} Mar 10 00:09:57 crc kubenswrapper[4906]: I0310 00:09:57.998418 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-664f87d5d8-7ktrx" event={"ID":"e646eb29-cf82-4566-bdb2-6e65f915ad9b","Type":"ContainerDied","Data":"66cd9f40d73a302112116cadb021b7bcd8e309bd9811c9976b9c665e632440ea"} Mar 10 00:09:57 crc kubenswrapper[4906]: I0310 00:09:57.998438 4906 scope.go:117] "RemoveContainer" 
containerID="b8af88492b2ba6afde680e1b2aeed48eb67b708944003fc872aa0fdef2e65345" Mar 10 00:09:57 crc kubenswrapper[4906]: I0310 00:09:57.998447 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-664f87d5d8-7ktrx" Mar 10 00:09:58 crc kubenswrapper[4906]: I0310 00:09:58.000526 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 00:09:58 crc kubenswrapper[4906]: I0310 00:09:58.000531 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"ab9454f9-7589-4fb0-ae3a-319be58b20f2","Type":"ContainerDied","Data":"270232557bf82a75a1ef48b7aadbe9f729efa50c05a1ee7e3012a6a8767143bd"} Mar 10 00:09:58 crc kubenswrapper[4906]: I0310 00:09:58.001466 4906 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="270232557bf82a75a1ef48b7aadbe9f729efa50c05a1ee7e3012a6a8767143bd" Mar 10 00:09:58 crc kubenswrapper[4906]: I0310 00:09:58.003014 4906 generic.go:334] "Generic (PLEG): container finished" podID="84f75957-59f7-4f21-990d-7581456a7c85" containerID="a26895736aec56059cd973a427124eb0187fa3acb75f6c14dbb8a57436f18ea5" exitCode=0 Mar 10 00:09:58 crc kubenswrapper[4906]: I0310 00:09:58.005127 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6bf76d6c95-wnfp6" Mar 10 00:09:58 crc kubenswrapper[4906]: I0310 00:09:58.010217 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6bf76d6c95-wnfp6" event={"ID":"84f75957-59f7-4f21-990d-7581456a7c85","Type":"ContainerDied","Data":"a26895736aec56059cd973a427124eb0187fa3acb75f6c14dbb8a57436f18ea5"} Mar 10 00:09:58 crc kubenswrapper[4906]: I0310 00:09:58.010251 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6bf76d6c95-wnfp6" event={"ID":"84f75957-59f7-4f21-990d-7581456a7c85","Type":"ContainerDied","Data":"8b87df0e2360784d66eab9a04e6539221098c6e58726fcfc4eda1402fef8a295"} Mar 10 00:09:58 crc kubenswrapper[4906]: I0310 00:09:58.020010 4906 scope.go:117] "RemoveContainer" containerID="b8af88492b2ba6afde680e1b2aeed48eb67b708944003fc872aa0fdef2e65345" Mar 10 00:09:58 crc kubenswrapper[4906]: E0310 00:09:58.020465 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8af88492b2ba6afde680e1b2aeed48eb67b708944003fc872aa0fdef2e65345\": container with ID starting with b8af88492b2ba6afde680e1b2aeed48eb67b708944003fc872aa0fdef2e65345 not found: ID does not exist" containerID="b8af88492b2ba6afde680e1b2aeed48eb67b708944003fc872aa0fdef2e65345" Mar 10 00:09:58 crc kubenswrapper[4906]: I0310 00:09:58.020507 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8af88492b2ba6afde680e1b2aeed48eb67b708944003fc872aa0fdef2e65345"} err="failed to get container status \"b8af88492b2ba6afde680e1b2aeed48eb67b708944003fc872aa0fdef2e65345\": rpc error: code = NotFound desc = could not find container \"b8af88492b2ba6afde680e1b2aeed48eb67b708944003fc872aa0fdef2e65345\": container with ID starting with 
b8af88492b2ba6afde680e1b2aeed48eb67b708944003fc872aa0fdef2e65345 not found: ID does not exist" Mar 10 00:09:58 crc kubenswrapper[4906]: I0310 00:09:58.020550 4906 scope.go:117] "RemoveContainer" containerID="a26895736aec56059cd973a427124eb0187fa3acb75f6c14dbb8a57436f18ea5" Mar 10 00:09:58 crc kubenswrapper[4906]: I0310 00:09:58.036910 4906 scope.go:117] "RemoveContainer" containerID="a26895736aec56059cd973a427124eb0187fa3acb75f6c14dbb8a57436f18ea5" Mar 10 00:09:58 crc kubenswrapper[4906]: I0310 00:09:58.037044 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-664f87d5d8-7ktrx"] Mar 10 00:09:58 crc kubenswrapper[4906]: E0310 00:09:58.037609 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a26895736aec56059cd973a427124eb0187fa3acb75f6c14dbb8a57436f18ea5\": container with ID starting with a26895736aec56059cd973a427124eb0187fa3acb75f6c14dbb8a57436f18ea5 not found: ID does not exist" containerID="a26895736aec56059cd973a427124eb0187fa3acb75f6c14dbb8a57436f18ea5" Mar 10 00:09:58 crc kubenswrapper[4906]: I0310 00:09:58.037748 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a26895736aec56059cd973a427124eb0187fa3acb75f6c14dbb8a57436f18ea5"} err="failed to get container status \"a26895736aec56059cd973a427124eb0187fa3acb75f6c14dbb8a57436f18ea5\": rpc error: code = NotFound desc = could not find container \"a26895736aec56059cd973a427124eb0187fa3acb75f6c14dbb8a57436f18ea5\": container with ID starting with a26895736aec56059cd973a427124eb0187fa3acb75f6c14dbb8a57436f18ea5 not found: ID does not exist" Mar 10 00:09:58 crc kubenswrapper[4906]: I0310 00:09:58.040824 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-664f87d5d8-7ktrx"] Mar 10 00:09:58 crc kubenswrapper[4906]: I0310 00:09:58.053804 4906 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bf76d6c95-wnfp6"] Mar 10 00:09:58 crc kubenswrapper[4906]: I0310 00:09:58.056387 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bf76d6c95-wnfp6"] Mar 10 00:09:58 crc kubenswrapper[4906]: I0310 00:09:58.586550 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84f75957-59f7-4f21-990d-7581456a7c85" path="/var/lib/kubelet/pods/84f75957-59f7-4f21-990d-7581456a7c85/volumes" Mar 10 00:09:58 crc kubenswrapper[4906]: I0310 00:09:58.587891 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e646eb29-cf82-4566-bdb2-6e65f915ad9b" path="/var/lib/kubelet/pods/e646eb29-cf82-4566-bdb2-6e65f915ad9b/volumes" Mar 10 00:09:59 crc kubenswrapper[4906]: I0310 00:09:59.084780 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mq564"] Mar 10 00:09:59 crc kubenswrapper[4906]: I0310 00:09:59.323604 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-m5t8r" Mar 10 00:09:59 crc kubenswrapper[4906]: I0310 00:09:59.323672 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-m5t8r" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.135272 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551690-8jrvf"] Mar 10 00:10:00 crc kubenswrapper[4906]: E0310 00:10:00.136063 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab9454f9-7589-4fb0-ae3a-319be58b20f2" containerName="pruner" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.136083 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab9454f9-7589-4fb0-ae3a-319be58b20f2" containerName="pruner" Mar 10 00:10:00 crc kubenswrapper[4906]: E0310 00:10:00.136095 4906 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="84f75957-59f7-4f21-990d-7581456a7c85" containerName="route-controller-manager" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.136103 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="84f75957-59f7-4f21-990d-7581456a7c85" containerName="route-controller-manager" Mar 10 00:10:00 crc kubenswrapper[4906]: E0310 00:10:00.136112 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e646eb29-cf82-4566-bdb2-6e65f915ad9b" containerName="controller-manager" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.136123 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="e646eb29-cf82-4566-bdb2-6e65f915ad9b" containerName="controller-manager" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.136259 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab9454f9-7589-4fb0-ae3a-319be58b20f2" containerName="pruner" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.136277 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="e646eb29-cf82-4566-bdb2-6e65f915ad9b" containerName="controller-manager" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.136288 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="84f75957-59f7-4f21-990d-7581456a7c85" containerName="route-controller-manager" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.136808 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551690-8jrvf" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.139423 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ktdr6" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.142984 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551690-8jrvf"] Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.193162 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d786d5d8c-6l67j"] Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.194504 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d786d5d8c-6l67j" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.196005 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5868864b8c-z4kkl"] Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.196791 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.196947 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.196818 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5868864b8c-z4kkl" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.197261 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.197353 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.197457 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.197506 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.203911 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.204005 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.204169 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.203915 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.204301 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.203959 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 
00:10:00.205665 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5868864b8c-z4kkl"] Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.207775 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.208181 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d786d5d8c-6l67j"] Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.218451 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/accac582-bb83-4d5c-ae2d-48797c5aeb03-config\") pod \"route-controller-manager-5d786d5d8c-6l67j\" (UID: \"accac582-bb83-4d5c-ae2d-48797c5aeb03\") " pod="openshift-route-controller-manager/route-controller-manager-5d786d5d8c-6l67j" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.218498 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/075a851f-d16a-43b4-8ced-44073d4f7810-serving-cert\") pod \"controller-manager-5868864b8c-z4kkl\" (UID: \"075a851f-d16a-43b4-8ced-44073d4f7810\") " pod="openshift-controller-manager/controller-manager-5868864b8c-z4kkl" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.218522 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/075a851f-d16a-43b4-8ced-44073d4f7810-config\") pod \"controller-manager-5868864b8c-z4kkl\" (UID: \"075a851f-d16a-43b4-8ced-44073d4f7810\") " pod="openshift-controller-manager/controller-manager-5868864b8c-z4kkl" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.218544 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/075a851f-d16a-43b4-8ced-44073d4f7810-proxy-ca-bundles\") pod \"controller-manager-5868864b8c-z4kkl\" (UID: \"075a851f-d16a-43b4-8ced-44073d4f7810\") " pod="openshift-controller-manager/controller-manager-5868864b8c-z4kkl" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.218561 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/accac582-bb83-4d5c-ae2d-48797c5aeb03-serving-cert\") pod \"route-controller-manager-5d786d5d8c-6l67j\" (UID: \"accac582-bb83-4d5c-ae2d-48797c5aeb03\") " pod="openshift-route-controller-manager/route-controller-manager-5d786d5d8c-6l67j" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.218580 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8wh7\" (UniqueName: \"kubernetes.io/projected/accac582-bb83-4d5c-ae2d-48797c5aeb03-kube-api-access-h8wh7\") pod \"route-controller-manager-5d786d5d8c-6l67j\" (UID: \"accac582-bb83-4d5c-ae2d-48797c5aeb03\") " pod="openshift-route-controller-manager/route-controller-manager-5d786d5d8c-6l67j" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.218600 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/accac582-bb83-4d5c-ae2d-48797c5aeb03-client-ca\") pod \"route-controller-manager-5d786d5d8c-6l67j\" (UID: \"accac582-bb83-4d5c-ae2d-48797c5aeb03\") " pod="openshift-route-controller-manager/route-controller-manager-5d786d5d8c-6l67j" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.218623 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddfnp\" (UniqueName: \"kubernetes.io/projected/075a851f-d16a-43b4-8ced-44073d4f7810-kube-api-access-ddfnp\") pod \"controller-manager-5868864b8c-z4kkl\" (UID: 
\"075a851f-d16a-43b4-8ced-44073d4f7810\") " pod="openshift-controller-manager/controller-manager-5868864b8c-z4kkl" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.218660 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/075a851f-d16a-43b4-8ced-44073d4f7810-client-ca\") pod \"controller-manager-5868864b8c-z4kkl\" (UID: \"075a851f-d16a-43b4-8ced-44073d4f7810\") " pod="openshift-controller-manager/controller-manager-5868864b8c-z4kkl" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.218680 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pv8jd\" (UniqueName: \"kubernetes.io/projected/d18b4f28-fc85-4aaf-af80-4272b80ef138-kube-api-access-pv8jd\") pod \"auto-csr-approver-29551690-8jrvf\" (UID: \"d18b4f28-fc85-4aaf-af80-4272b80ef138\") " pod="openshift-infra/auto-csr-approver-29551690-8jrvf" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.319657 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/075a851f-d16a-43b4-8ced-44073d4f7810-config\") pod \"controller-manager-5868864b8c-z4kkl\" (UID: \"075a851f-d16a-43b4-8ced-44073d4f7810\") " pod="openshift-controller-manager/controller-manager-5868864b8c-z4kkl" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.320063 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/075a851f-d16a-43b4-8ced-44073d4f7810-proxy-ca-bundles\") pod \"controller-manager-5868864b8c-z4kkl\" (UID: \"075a851f-d16a-43b4-8ced-44073d4f7810\") " pod="openshift-controller-manager/controller-manager-5868864b8c-z4kkl" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.320114 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/accac582-bb83-4d5c-ae2d-48797c5aeb03-serving-cert\") pod \"route-controller-manager-5d786d5d8c-6l67j\" (UID: \"accac582-bb83-4d5c-ae2d-48797c5aeb03\") " pod="openshift-route-controller-manager/route-controller-manager-5d786d5d8c-6l67j" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.320161 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8wh7\" (UniqueName: \"kubernetes.io/projected/accac582-bb83-4d5c-ae2d-48797c5aeb03-kube-api-access-h8wh7\") pod \"route-controller-manager-5d786d5d8c-6l67j\" (UID: \"accac582-bb83-4d5c-ae2d-48797c5aeb03\") " pod="openshift-route-controller-manager/route-controller-manager-5d786d5d8c-6l67j" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.320192 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/accac582-bb83-4d5c-ae2d-48797c5aeb03-client-ca\") pod \"route-controller-manager-5d786d5d8c-6l67j\" (UID: \"accac582-bb83-4d5c-ae2d-48797c5aeb03\") " pod="openshift-route-controller-manager/route-controller-manager-5d786d5d8c-6l67j" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.320224 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddfnp\" (UniqueName: \"kubernetes.io/projected/075a851f-d16a-43b4-8ced-44073d4f7810-kube-api-access-ddfnp\") pod \"controller-manager-5868864b8c-z4kkl\" (UID: \"075a851f-d16a-43b4-8ced-44073d4f7810\") " pod="openshift-controller-manager/controller-manager-5868864b8c-z4kkl" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.320251 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/075a851f-d16a-43b4-8ced-44073d4f7810-client-ca\") pod \"controller-manager-5868864b8c-z4kkl\" (UID: \"075a851f-d16a-43b4-8ced-44073d4f7810\") " pod="openshift-controller-manager/controller-manager-5868864b8c-z4kkl" Mar 10 
00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.320270 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pv8jd\" (UniqueName: \"kubernetes.io/projected/d18b4f28-fc85-4aaf-af80-4272b80ef138-kube-api-access-pv8jd\") pod \"auto-csr-approver-29551690-8jrvf\" (UID: \"d18b4f28-fc85-4aaf-af80-4272b80ef138\") " pod="openshift-infra/auto-csr-approver-29551690-8jrvf" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.320352 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/accac582-bb83-4d5c-ae2d-48797c5aeb03-config\") pod \"route-controller-manager-5d786d5d8c-6l67j\" (UID: \"accac582-bb83-4d5c-ae2d-48797c5aeb03\") " pod="openshift-route-controller-manager/route-controller-manager-5d786d5d8c-6l67j" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.320382 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/075a851f-d16a-43b4-8ced-44073d4f7810-serving-cert\") pod \"controller-manager-5868864b8c-z4kkl\" (UID: \"075a851f-d16a-43b4-8ced-44073d4f7810\") " pod="openshift-controller-manager/controller-manager-5868864b8c-z4kkl" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.321216 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/accac582-bb83-4d5c-ae2d-48797c5aeb03-client-ca\") pod \"route-controller-manager-5d786d5d8c-6l67j\" (UID: \"accac582-bb83-4d5c-ae2d-48797c5aeb03\") " pod="openshift-route-controller-manager/route-controller-manager-5d786d5d8c-6l67j" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.322317 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/075a851f-d16a-43b4-8ced-44073d4f7810-client-ca\") pod \"controller-manager-5868864b8c-z4kkl\" (UID: 
\"075a851f-d16a-43b4-8ced-44073d4f7810\") " pod="openshift-controller-manager/controller-manager-5868864b8c-z4kkl" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.322775 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/accac582-bb83-4d5c-ae2d-48797c5aeb03-config\") pod \"route-controller-manager-5d786d5d8c-6l67j\" (UID: \"accac582-bb83-4d5c-ae2d-48797c5aeb03\") " pod="openshift-route-controller-manager/route-controller-manager-5d786d5d8c-6l67j" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.323404 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/075a851f-d16a-43b4-8ced-44073d4f7810-proxy-ca-bundles\") pod \"controller-manager-5868864b8c-z4kkl\" (UID: \"075a851f-d16a-43b4-8ced-44073d4f7810\") " pod="openshift-controller-manager/controller-manager-5868864b8c-z4kkl" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.323780 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/075a851f-d16a-43b4-8ced-44073d4f7810-config\") pod \"controller-manager-5868864b8c-z4kkl\" (UID: \"075a851f-d16a-43b4-8ced-44073d4f7810\") " pod="openshift-controller-manager/controller-manager-5868864b8c-z4kkl" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.330003 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/accac582-bb83-4d5c-ae2d-48797c5aeb03-serving-cert\") pod \"route-controller-manager-5d786d5d8c-6l67j\" (UID: \"accac582-bb83-4d5c-ae2d-48797c5aeb03\") " pod="openshift-route-controller-manager/route-controller-manager-5d786d5d8c-6l67j" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.332557 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/075a851f-d16a-43b4-8ced-44073d4f7810-serving-cert\") pod \"controller-manager-5868864b8c-z4kkl\" (UID: \"075a851f-d16a-43b4-8ced-44073d4f7810\") " pod="openshift-controller-manager/controller-manager-5868864b8c-z4kkl" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.336477 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pv8jd\" (UniqueName: \"kubernetes.io/projected/d18b4f28-fc85-4aaf-af80-4272b80ef138-kube-api-access-pv8jd\") pod \"auto-csr-approver-29551690-8jrvf\" (UID: \"d18b4f28-fc85-4aaf-af80-4272b80ef138\") " pod="openshift-infra/auto-csr-approver-29551690-8jrvf" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.339085 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddfnp\" (UniqueName: \"kubernetes.io/projected/075a851f-d16a-43b4-8ced-44073d4f7810-kube-api-access-ddfnp\") pod \"controller-manager-5868864b8c-z4kkl\" (UID: \"075a851f-d16a-43b4-8ced-44073d4f7810\") " pod="openshift-controller-manager/controller-manager-5868864b8c-z4kkl" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.340378 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8wh7\" (UniqueName: \"kubernetes.io/projected/accac582-bb83-4d5c-ae2d-48797c5aeb03-kube-api-access-h8wh7\") pod \"route-controller-manager-5d786d5d8c-6l67j\" (UID: \"accac582-bb83-4d5c-ae2d-48797c5aeb03\") " pod="openshift-route-controller-manager/route-controller-manager-5d786d5d8c-6l67j" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.459932 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551690-8jrvf" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.477818 4906 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-m5t8r" podUID="947e7159-64b1-413f-8cee-daea0a8d0f3e" containerName="registry-server" probeResult="failure" output=< Mar 10 00:10:00 crc kubenswrapper[4906]: timeout: failed to connect service ":50051" within 1s Mar 10 00:10:00 crc kubenswrapper[4906]: > Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.502686 4906 patch_prober.go:28] interesting pod/machine-config-daemon-bxtw4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.502757 4906 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.511887 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d786d5d8c-6l67j" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.521440 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5868864b8c-z4kkl" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.788005 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.789290 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.792070 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.794342 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.802806 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.899261 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5868864b8c-z4kkl"] Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.915430 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551690-8jrvf"] Mar 10 00:10:00 crc kubenswrapper[4906]: W0310 00:10:00.916053 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod075a851f_d16a_43b4_8ced_44073d4f7810.slice/crio-a4f50e3865a4b10449ca40a6aebc57a8fd9cfdbf1616dd2a32fd96b271050dcb WatchSource:0}: Error finding container a4f50e3865a4b10449ca40a6aebc57a8fd9cfdbf1616dd2a32fd96b271050dcb: Status 404 returned error can't find the container with id a4f50e3865a4b10449ca40a6aebc57a8fd9cfdbf1616dd2a32fd96b271050dcb Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.929878 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ffc7eabb-e49f-497c-912d-f997514651c5-kubelet-dir\") pod \"installer-9-crc\" (UID: \"ffc7eabb-e49f-497c-912d-f997514651c5\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.930045 4906 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ffc7eabb-e49f-497c-912d-f997514651c5-var-lock\") pod \"installer-9-crc\" (UID: \"ffc7eabb-e49f-497c-912d-f997514651c5\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 00:10:00 crc kubenswrapper[4906]: I0310 00:10:00.930085 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ffc7eabb-e49f-497c-912d-f997514651c5-kube-api-access\") pod \"installer-9-crc\" (UID: \"ffc7eabb-e49f-497c-912d-f997514651c5\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 00:10:01 crc kubenswrapper[4906]: I0310 00:10:01.018421 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551690-8jrvf" event={"ID":"d18b4f28-fc85-4aaf-af80-4272b80ef138","Type":"ContainerStarted","Data":"bf193ee1b4c6517efc77e0801d0ddde269a8ced1f84b139735b55723970d2607"} Mar 10 00:10:01 crc kubenswrapper[4906]: I0310 00:10:01.019236 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5868864b8c-z4kkl" event={"ID":"075a851f-d16a-43b4-8ced-44073d4f7810","Type":"ContainerStarted","Data":"a4f50e3865a4b10449ca40a6aebc57a8fd9cfdbf1616dd2a32fd96b271050dcb"} Mar 10 00:10:01 crc kubenswrapper[4906]: I0310 00:10:01.031024 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ffc7eabb-e49f-497c-912d-f997514651c5-var-lock\") pod \"installer-9-crc\" (UID: \"ffc7eabb-e49f-497c-912d-f997514651c5\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 00:10:01 crc kubenswrapper[4906]: I0310 00:10:01.031080 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ffc7eabb-e49f-497c-912d-f997514651c5-kube-api-access\") pod 
\"installer-9-crc\" (UID: \"ffc7eabb-e49f-497c-912d-f997514651c5\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 00:10:01 crc kubenswrapper[4906]: I0310 00:10:01.031113 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ffc7eabb-e49f-497c-912d-f997514651c5-kubelet-dir\") pod \"installer-9-crc\" (UID: \"ffc7eabb-e49f-497c-912d-f997514651c5\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 00:10:01 crc kubenswrapper[4906]: I0310 00:10:01.031190 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ffc7eabb-e49f-497c-912d-f997514651c5-kubelet-dir\") pod \"installer-9-crc\" (UID: \"ffc7eabb-e49f-497c-912d-f997514651c5\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 00:10:01 crc kubenswrapper[4906]: I0310 00:10:01.031223 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ffc7eabb-e49f-497c-912d-f997514651c5-var-lock\") pod \"installer-9-crc\" (UID: \"ffc7eabb-e49f-497c-912d-f997514651c5\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 00:10:01 crc kubenswrapper[4906]: I0310 00:10:01.041057 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d786d5d8c-6l67j"] Mar 10 00:10:01 crc kubenswrapper[4906]: W0310 00:10:01.047825 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaccac582_bb83_4d5c_ae2d_48797c5aeb03.slice/crio-b41485d866be1c520b2ce6ad7c5846f48e5232b25e9b6ebe39e8f5ce2b6105bb WatchSource:0}: Error finding container b41485d866be1c520b2ce6ad7c5846f48e5232b25e9b6ebe39e8f5ce2b6105bb: Status 404 returned error can't find the container with id b41485d866be1c520b2ce6ad7c5846f48e5232b25e9b6ebe39e8f5ce2b6105bb Mar 10 00:10:01 crc kubenswrapper[4906]: I0310 
00:10:01.053691 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ffc7eabb-e49f-497c-912d-f997514651c5-kube-api-access\") pod \"installer-9-crc\" (UID: \"ffc7eabb-e49f-497c-912d-f997514651c5\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 00:10:01 crc kubenswrapper[4906]: I0310 00:10:01.113284 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 10 00:10:01 crc kubenswrapper[4906]: I0310 00:10:01.336597 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 10 00:10:02 crc kubenswrapper[4906]: I0310 00:10:02.026491 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d786d5d8c-6l67j" event={"ID":"accac582-bb83-4d5c-ae2d-48797c5aeb03","Type":"ContainerStarted","Data":"9fa9d9965f2af145a772981fd6397ed8132ab7e390f7e92d3ccbe88fb7afd3f9"} Mar 10 00:10:02 crc kubenswrapper[4906]: I0310 00:10:02.026884 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5d786d5d8c-6l67j" Mar 10 00:10:02 crc kubenswrapper[4906]: I0310 00:10:02.026899 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d786d5d8c-6l67j" event={"ID":"accac582-bb83-4d5c-ae2d-48797c5aeb03","Type":"ContainerStarted","Data":"b41485d866be1c520b2ce6ad7c5846f48e5232b25e9b6ebe39e8f5ce2b6105bb"} Mar 10 00:10:02 crc kubenswrapper[4906]: I0310 00:10:02.028695 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5868864b8c-z4kkl" event={"ID":"075a851f-d16a-43b4-8ced-44073d4f7810","Type":"ContainerStarted","Data":"8412ad8d4ec241b26510eebc9ee336d39ac36878db237652761b29dfae86fc2b"} Mar 10 00:10:02 crc kubenswrapper[4906]: I0310 
00:10:02.029294 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5868864b8c-z4kkl" Mar 10 00:10:02 crc kubenswrapper[4906]: I0310 00:10:02.030622 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ffc7eabb-e49f-497c-912d-f997514651c5","Type":"ContainerStarted","Data":"dc6fcf0a68b971ef82fe314f5aaa86e92f754f07405cc299eeb8aa43fba66717"} Mar 10 00:10:02 crc kubenswrapper[4906]: I0310 00:10:02.030694 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ffc7eabb-e49f-497c-912d-f997514651c5","Type":"ContainerStarted","Data":"9b5b4b7f7d1c2388ae0a54f8e94b505c981764d43eaa2c0cb8372a937bced375"} Mar 10 00:10:02 crc kubenswrapper[4906]: I0310 00:10:02.033904 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5868864b8c-z4kkl" Mar 10 00:10:02 crc kubenswrapper[4906]: I0310 00:10:02.034671 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5d786d5d8c-6l67j" Mar 10 00:10:02 crc kubenswrapper[4906]: I0310 00:10:02.071438 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5d786d5d8c-6l67j" podStartSLOduration=7.071410404 podStartE2EDuration="7.071410404s" podCreationTimestamp="2026-03-10 00:09:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:10:02.048815326 +0000 UTC m=+228.196710438" watchObservedRunningTime="2026-03-10 00:10:02.071410404 +0000 UTC m=+228.219305516" Mar 10 00:10:02 crc kubenswrapper[4906]: I0310 00:10:02.099853 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-controller-manager/controller-manager-5868864b8c-z4kkl" podStartSLOduration=7.099828977 podStartE2EDuration="7.099828977s" podCreationTimestamp="2026-03-10 00:09:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:10:02.098705055 +0000 UTC m=+228.246600167" watchObservedRunningTime="2026-03-10 00:10:02.099828977 +0000 UTC m=+228.247724089" Mar 10 00:10:03 crc kubenswrapper[4906]: I0310 00:10:03.064060 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=3.064011843 podStartE2EDuration="3.064011843s" podCreationTimestamp="2026-03-10 00:10:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:10:03.05751633 +0000 UTC m=+229.205411442" watchObservedRunningTime="2026-03-10 00:10:03.064011843 +0000 UTC m=+229.211906985" Mar 10 00:10:05 crc kubenswrapper[4906]: I0310 00:10:05.746616 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-t65x9" Mar 10 00:10:05 crc kubenswrapper[4906]: I0310 00:10:05.747086 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-t65x9" Mar 10 00:10:05 crc kubenswrapper[4906]: I0310 00:10:05.792704 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-t65x9" Mar 10 00:10:05 crc kubenswrapper[4906]: I0310 00:10:05.800584 4906 csr.go:261] certificate signing request csr-t4hrd is approved, waiting to be issued Mar 10 00:10:05 crc kubenswrapper[4906]: I0310 00:10:05.812421 4906 csr.go:257] certificate signing request csr-t4hrd is issued Mar 10 00:10:06 crc kubenswrapper[4906]: I0310 00:10:06.066424 4906 generic.go:334] "Generic (PLEG): container finished" 
podID="52694cc4-226b-4bd5-a6c7-0ebd711926e2" containerID="3bb25fbb8b0ed4921199667d54f1c840740b41d8811ac53682480adc15a4743b" exitCode=0 Mar 10 00:10:06 crc kubenswrapper[4906]: I0310 00:10:06.066538 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhmhd" event={"ID":"52694cc4-226b-4bd5-a6c7-0ebd711926e2","Type":"ContainerDied","Data":"3bb25fbb8b0ed4921199667d54f1c840740b41d8811ac53682480adc15a4743b"} Mar 10 00:10:06 crc kubenswrapper[4906]: I0310 00:10:06.069581 4906 generic.go:334] "Generic (PLEG): container finished" podID="d18b4f28-fc85-4aaf-af80-4272b80ef138" containerID="ce97fe799a72e7dbdd8f5268ed7002df541d96660f4f6d940fc047b5ee3643d7" exitCode=0 Mar 10 00:10:06 crc kubenswrapper[4906]: I0310 00:10:06.069657 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551690-8jrvf" event={"ID":"d18b4f28-fc85-4aaf-af80-4272b80ef138","Type":"ContainerDied","Data":"ce97fe799a72e7dbdd8f5268ed7002df541d96660f4f6d940fc047b5ee3643d7"} Mar 10 00:10:06 crc kubenswrapper[4906]: I0310 00:10:06.077094 4906 generic.go:334] "Generic (PLEG): container finished" podID="68c8b80a-0af0-46cb-8a57-a353444de9dc" containerID="bdec8f9b1877ddde359a4f10d14b9aad9979f8d6d4c0d1b1a9d1a0282ca458d1" exitCode=0 Mar 10 00:10:06 crc kubenswrapper[4906]: I0310 00:10:06.077136 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nwlr8" event={"ID":"68c8b80a-0af0-46cb-8a57-a353444de9dc","Type":"ContainerDied","Data":"bdec8f9b1877ddde359a4f10d14b9aad9979f8d6d4c0d1b1a9d1a0282ca458d1"} Mar 10 00:10:06 crc kubenswrapper[4906]: I0310 00:10:06.127416 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-t65x9" Mar 10 00:10:06 crc kubenswrapper[4906]: I0310 00:10:06.813614 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation 
deadline is 2027-01-10 11:57:01.351273337 +0000 UTC Mar 10 00:10:06 crc kubenswrapper[4906]: I0310 00:10:06.814010 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7355h46m54.537265957s for next certificate rotation Mar 10 00:10:07 crc kubenswrapper[4906]: I0310 00:10:07.087275 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhmhd" event={"ID":"52694cc4-226b-4bd5-a6c7-0ebd711926e2","Type":"ContainerStarted","Data":"086773ee90732ab31fb14a62f923340b51b27dd62d3352d268d528ae89554642"} Mar 10 00:10:07 crc kubenswrapper[4906]: I0310 00:10:07.090395 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nwlr8" event={"ID":"68c8b80a-0af0-46cb-8a57-a353444de9dc","Type":"ContainerStarted","Data":"22f829c1c6b4ebd7aa3a25c52f981aa5da2781bd3283551c47f8df2849453d92"} Mar 10 00:10:07 crc kubenswrapper[4906]: I0310 00:10:07.092511 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nvvhx" event={"ID":"dc4d2e8f-54ca-464b-b186-432747b22864","Type":"ContainerStarted","Data":"0a3bec7e1c35fcaae5318f83d776257e5b5d9ce3cb7961baf70b55652f175381"} Mar 10 00:10:07 crc kubenswrapper[4906]: I0310 00:10:07.094565 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551688-fkkqj" event={"ID":"094c6270-b610-42c0-a6ce-3c146cb6bb6c","Type":"ContainerStarted","Data":"82c8cc3ef8c5b872fd0f4d97b18b2e2f95cbc6b451ee0407027a5c758f1f15d6"} Mar 10 00:10:07 crc kubenswrapper[4906]: I0310 00:10:07.108608 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fhmhd" podStartSLOduration=3.261562653 podStartE2EDuration="52.108585194s" podCreationTimestamp="2026-03-10 00:09:15 +0000 UTC" firstStartedPulling="2026-03-10 00:09:18.074203549 +0000 UTC m=+184.222098651" lastFinishedPulling="2026-03-10 00:10:06.92122609 +0000 UTC 
m=+233.069121192" observedRunningTime="2026-03-10 00:10:07.10667475 +0000 UTC m=+233.254569862" watchObservedRunningTime="2026-03-10 00:10:07.108585194 +0000 UTC m=+233.256480306" Mar 10 00:10:07 crc kubenswrapper[4906]: I0310 00:10:07.132382 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nwlr8" podStartSLOduration=2.816210489 podStartE2EDuration="49.132363866s" podCreationTimestamp="2026-03-10 00:09:18 +0000 UTC" firstStartedPulling="2026-03-10 00:09:20.325790841 +0000 UTC m=+186.473685953" lastFinishedPulling="2026-03-10 00:10:06.641944218 +0000 UTC m=+232.789839330" observedRunningTime="2026-03-10 00:10:07.128311822 +0000 UTC m=+233.276206934" watchObservedRunningTime="2026-03-10 00:10:07.132363866 +0000 UTC m=+233.280258968" Mar 10 00:10:07 crc kubenswrapper[4906]: I0310 00:10:07.168058 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551688-fkkqj" podStartSLOduration=73.235720266 podStartE2EDuration="2m7.168032184s" podCreationTimestamp="2026-03-10 00:08:00 +0000 UTC" firstStartedPulling="2026-03-10 00:09:12.618960678 +0000 UTC m=+178.766855790" lastFinishedPulling="2026-03-10 00:10:06.551272596 +0000 UTC m=+232.699167708" observedRunningTime="2026-03-10 00:10:07.167377086 +0000 UTC m=+233.315272198" watchObservedRunningTime="2026-03-10 00:10:07.168032184 +0000 UTC m=+233.315927306" Mar 10 00:10:07 crc kubenswrapper[4906]: I0310 00:10:07.460757 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551690-8jrvf" Mar 10 00:10:07 crc kubenswrapper[4906]: I0310 00:10:07.545129 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pv8jd\" (UniqueName: \"kubernetes.io/projected/d18b4f28-fc85-4aaf-af80-4272b80ef138-kube-api-access-pv8jd\") pod \"d18b4f28-fc85-4aaf-af80-4272b80ef138\" (UID: \"d18b4f28-fc85-4aaf-af80-4272b80ef138\") " Mar 10 00:10:07 crc kubenswrapper[4906]: I0310 00:10:07.552411 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d18b4f28-fc85-4aaf-af80-4272b80ef138-kube-api-access-pv8jd" (OuterVolumeSpecName: "kube-api-access-pv8jd") pod "d18b4f28-fc85-4aaf-af80-4272b80ef138" (UID: "d18b4f28-fc85-4aaf-af80-4272b80ef138"). InnerVolumeSpecName "kube-api-access-pv8jd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:10:07 crc kubenswrapper[4906]: I0310 00:10:07.646517 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pv8jd\" (UniqueName: \"kubernetes.io/projected/d18b4f28-fc85-4aaf-af80-4272b80ef138-kube-api-access-pv8jd\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:07 crc kubenswrapper[4906]: I0310 00:10:07.815127 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-04 19:50:06.489787665 +0000 UTC Mar 10 00:10:07 crc kubenswrapper[4906]: I0310 00:10:07.815168 4906 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6475h39m58.674622505s for next certificate rotation Mar 10 00:10:08 crc kubenswrapper[4906]: I0310 00:10:08.104423 4906 generic.go:334] "Generic (PLEG): container finished" podID="f1beb2f4-c1c5-488d-8c76-bed30174a0de" containerID="711228789c558ea0d1afc95a96bf9def92762e27d2713a45164c7f89b91178b2" exitCode=0 Mar 10 00:10:08 crc kubenswrapper[4906]: I0310 00:10:08.104494 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-whmcg" event={"ID":"f1beb2f4-c1c5-488d-8c76-bed30174a0de","Type":"ContainerDied","Data":"711228789c558ea0d1afc95a96bf9def92762e27d2713a45164c7f89b91178b2"} Mar 10 00:10:08 crc kubenswrapper[4906]: I0310 00:10:08.108020 4906 generic.go:334] "Generic (PLEG): container finished" podID="dc4d2e8f-54ca-464b-b186-432747b22864" containerID="0a3bec7e1c35fcaae5318f83d776257e5b5d9ce3cb7961baf70b55652f175381" exitCode=0 Mar 10 00:10:08 crc kubenswrapper[4906]: I0310 00:10:08.108064 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nvvhx" event={"ID":"dc4d2e8f-54ca-464b-b186-432747b22864","Type":"ContainerDied","Data":"0a3bec7e1c35fcaae5318f83d776257e5b5d9ce3cb7961baf70b55652f175381"} Mar 10 00:10:08 crc kubenswrapper[4906]: I0310 00:10:08.110739 4906 generic.go:334] "Generic (PLEG): container finished" podID="094c6270-b610-42c0-a6ce-3c146cb6bb6c" containerID="82c8cc3ef8c5b872fd0f4d97b18b2e2f95cbc6b451ee0407027a5c758f1f15d6" exitCode=0 Mar 10 00:10:08 crc kubenswrapper[4906]: I0310 00:10:08.110819 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551688-fkkqj" event={"ID":"094c6270-b610-42c0-a6ce-3c146cb6bb6c","Type":"ContainerDied","Data":"82c8cc3ef8c5b872fd0f4d97b18b2e2f95cbc6b451ee0407027a5c758f1f15d6"} Mar 10 00:10:08 crc kubenswrapper[4906]: I0310 00:10:08.116448 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551690-8jrvf" event={"ID":"d18b4f28-fc85-4aaf-af80-4272b80ef138","Type":"ContainerDied","Data":"bf193ee1b4c6517efc77e0801d0ddde269a8ced1f84b139735b55723970d2607"} Mar 10 00:10:08 crc kubenswrapper[4906]: I0310 00:10:08.116500 4906 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf193ee1b4c6517efc77e0801d0ddde269a8ced1f84b139735b55723970d2607" Mar 10 00:10:08 crc kubenswrapper[4906]: I0310 00:10:08.116626 4906 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551690-8jrvf" Mar 10 00:10:08 crc kubenswrapper[4906]: I0310 00:10:08.968269 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nwlr8" Mar 10 00:10:08 crc kubenswrapper[4906]: I0310 00:10:08.968650 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nwlr8" Mar 10 00:10:09 crc kubenswrapper[4906]: I0310 00:10:09.122248 4906 generic.go:334] "Generic (PLEG): container finished" podID="bfd0c098-c58b-456c-a9b2-270e749bc274" containerID="4ce3047f440ddec2444fb9a4f1de15726c662548dfbdcf77e785c010af32ce70" exitCode=0 Mar 10 00:10:09 crc kubenswrapper[4906]: I0310 00:10:09.122303 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jbwl2" event={"ID":"bfd0c098-c58b-456c-a9b2-270e749bc274","Type":"ContainerDied","Data":"4ce3047f440ddec2444fb9a4f1de15726c662548dfbdcf77e785c010af32ce70"} Mar 10 00:10:09 crc kubenswrapper[4906]: I0310 00:10:09.125963 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-whmcg" event={"ID":"f1beb2f4-c1c5-488d-8c76-bed30174a0de","Type":"ContainerStarted","Data":"c75b6e3143fcef88791c1dfe2ebcf30b7746c0039a8d03cf57565dae7345196a"} Mar 10 00:10:09 crc kubenswrapper[4906]: I0310 00:10:09.128700 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-js2xl" event={"ID":"9a68ed18-11e8-4943-854e-a8e4a5566313","Type":"ContainerStarted","Data":"3d04e14238006d327f4c8e7fa63ffef1ab20cd17825a700fd7454d53a5e55d8f"} Mar 10 00:10:09 crc kubenswrapper[4906]: I0310 00:10:09.172935 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-whmcg" podStartSLOduration=2.983892825 podStartE2EDuration="52.172917228s" podCreationTimestamp="2026-03-10 00:09:17 +0000 
UTC" firstStartedPulling="2026-03-10 00:09:19.349113427 +0000 UTC m=+185.497008529" lastFinishedPulling="2026-03-10 00:10:08.53813782 +0000 UTC m=+234.686032932" observedRunningTime="2026-03-10 00:10:09.170773517 +0000 UTC m=+235.318668639" watchObservedRunningTime="2026-03-10 00:10:09.172917228 +0000 UTC m=+235.320812340" Mar 10 00:10:09 crc kubenswrapper[4906]: I0310 00:10:09.374957 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-m5t8r" Mar 10 00:10:09 crc kubenswrapper[4906]: I0310 00:10:09.418996 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-m5t8r" Mar 10 00:10:09 crc kubenswrapper[4906]: I0310 00:10:09.538807 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551688-fkkqj" Mar 10 00:10:09 crc kubenswrapper[4906]: I0310 00:10:09.569037 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2nbzf\" (UniqueName: \"kubernetes.io/projected/094c6270-b610-42c0-a6ce-3c146cb6bb6c-kube-api-access-2nbzf\") pod \"094c6270-b610-42c0-a6ce-3c146cb6bb6c\" (UID: \"094c6270-b610-42c0-a6ce-3c146cb6bb6c\") " Mar 10 00:10:09 crc kubenswrapper[4906]: I0310 00:10:09.575337 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/094c6270-b610-42c0-a6ce-3c146cb6bb6c-kube-api-access-2nbzf" (OuterVolumeSpecName: "kube-api-access-2nbzf") pod "094c6270-b610-42c0-a6ce-3c146cb6bb6c" (UID: "094c6270-b610-42c0-a6ce-3c146cb6bb6c"). InnerVolumeSpecName "kube-api-access-2nbzf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:10:09 crc kubenswrapper[4906]: I0310 00:10:09.670116 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2nbzf\" (UniqueName: \"kubernetes.io/projected/094c6270-b610-42c0-a6ce-3c146cb6bb6c-kube-api-access-2nbzf\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:10 crc kubenswrapper[4906]: I0310 00:10:10.011302 4906 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nwlr8" podUID="68c8b80a-0af0-46cb-8a57-a353444de9dc" containerName="registry-server" probeResult="failure" output=< Mar 10 00:10:10 crc kubenswrapper[4906]: timeout: failed to connect service ":50051" within 1s Mar 10 00:10:10 crc kubenswrapper[4906]: > Mar 10 00:10:10 crc kubenswrapper[4906]: I0310 00:10:10.138826 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551688-fkkqj" Mar 10 00:10:10 crc kubenswrapper[4906]: I0310 00:10:10.138839 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551688-fkkqj" event={"ID":"094c6270-b610-42c0-a6ce-3c146cb6bb6c","Type":"ContainerDied","Data":"702b2e5d0f0b233b4c7fb5bb04fc8b9918a7e64d152cad9375aa67adaa5802a1"} Mar 10 00:10:10 crc kubenswrapper[4906]: I0310 00:10:10.139266 4906 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="702b2e5d0f0b233b4c7fb5bb04fc8b9918a7e64d152cad9375aa67adaa5802a1" Mar 10 00:10:10 crc kubenswrapper[4906]: I0310 00:10:10.142726 4906 generic.go:334] "Generic (PLEG): container finished" podID="9a68ed18-11e8-4943-854e-a8e4a5566313" containerID="3d04e14238006d327f4c8e7fa63ffef1ab20cd17825a700fd7454d53a5e55d8f" exitCode=0 Mar 10 00:10:10 crc kubenswrapper[4906]: I0310 00:10:10.142861 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-js2xl" 
event={"ID":"9a68ed18-11e8-4943-854e-a8e4a5566313","Type":"ContainerDied","Data":"3d04e14238006d327f4c8e7fa63ffef1ab20cd17825a700fd7454d53a5e55d8f"} Mar 10 00:10:10 crc kubenswrapper[4906]: I0310 00:10:10.146847 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jbwl2" event={"ID":"bfd0c098-c58b-456c-a9b2-270e749bc274","Type":"ContainerStarted","Data":"62ac30d017905e7854001a3c88c6792379fd5521640721632c05fa39acfb3dfe"} Mar 10 00:10:10 crc kubenswrapper[4906]: I0310 00:10:10.212030 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jbwl2" podStartSLOduration=5.050380192 podStartE2EDuration="55.21200878s" podCreationTimestamp="2026-03-10 00:09:15 +0000 UTC" firstStartedPulling="2026-03-10 00:09:19.349913529 +0000 UTC m=+185.497808631" lastFinishedPulling="2026-03-10 00:10:09.511542107 +0000 UTC m=+235.659437219" observedRunningTime="2026-03-10 00:10:10.208535102 +0000 UTC m=+236.356430214" watchObservedRunningTime="2026-03-10 00:10:10.21200878 +0000 UTC m=+236.359903892" Mar 10 00:10:11 crc kubenswrapper[4906]: I0310 00:10:11.152759 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nvvhx" event={"ID":"dc4d2e8f-54ca-464b-b186-432747b22864","Type":"ContainerStarted","Data":"d555d1b48bb1766a9f43cd2a2627345cd232c241457ed488eb1f7b70564cc834"} Mar 10 00:10:11 crc kubenswrapper[4906]: I0310 00:10:11.178123 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nvvhx" podStartSLOduration=4.767719155 podStartE2EDuration="56.17810513s" podCreationTimestamp="2026-03-10 00:09:15 +0000 UTC" firstStartedPulling="2026-03-10 00:09:19.350024172 +0000 UTC m=+185.497919284" lastFinishedPulling="2026-03-10 00:10:10.760410127 +0000 UTC m=+236.908305259" observedRunningTime="2026-03-10 00:10:11.172257844 +0000 UTC m=+237.320152966" 
watchObservedRunningTime="2026-03-10 00:10:11.17810513 +0000 UTC m=+237.326000252" Mar 10 00:10:12 crc kubenswrapper[4906]: I0310 00:10:12.163753 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-js2xl" event={"ID":"9a68ed18-11e8-4943-854e-a8e4a5566313","Type":"ContainerStarted","Data":"9b7af24bc713c60d61d293891a7c9cab0bd5629ac587ed75009c8e1827efa8b3"} Mar 10 00:10:12 crc kubenswrapper[4906]: I0310 00:10:12.188618 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-js2xl" podStartSLOduration=3.273893859 podStartE2EDuration="55.188602964s" podCreationTimestamp="2026-03-10 00:09:17 +0000 UTC" firstStartedPulling="2026-03-10 00:09:19.354614082 +0000 UTC m=+185.502509194" lastFinishedPulling="2026-03-10 00:10:11.269323187 +0000 UTC m=+237.417218299" observedRunningTime="2026-03-10 00:10:12.185181818 +0000 UTC m=+238.333076940" watchObservedRunningTime="2026-03-10 00:10:12.188602964 +0000 UTC m=+238.336498076" Mar 10 00:10:12 crc kubenswrapper[4906]: I0310 00:10:12.422821 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m5t8r"] Mar 10 00:10:12 crc kubenswrapper[4906]: I0310 00:10:12.423049 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-m5t8r" podUID="947e7159-64b1-413f-8cee-daea0a8d0f3e" containerName="registry-server" containerID="cri-o://5e87f60c38a416b4361c3d348c8d7fd07655cfdd139585279fd93a787cdacb4c" gracePeriod=2 Mar 10 00:10:13 crc kubenswrapper[4906]: I0310 00:10:13.175257 4906 generic.go:334] "Generic (PLEG): container finished" podID="947e7159-64b1-413f-8cee-daea0a8d0f3e" containerID="5e87f60c38a416b4361c3d348c8d7fd07655cfdd139585279fd93a787cdacb4c" exitCode=0 Mar 10 00:10:13 crc kubenswrapper[4906]: I0310 00:10:13.175436 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m5t8r" 
event={"ID":"947e7159-64b1-413f-8cee-daea0a8d0f3e","Type":"ContainerDied","Data":"5e87f60c38a416b4361c3d348c8d7fd07655cfdd139585279fd93a787cdacb4c"} Mar 10 00:10:13 crc kubenswrapper[4906]: I0310 00:10:13.504210 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m5t8r" Mar 10 00:10:13 crc kubenswrapper[4906]: I0310 00:10:13.620586 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/947e7159-64b1-413f-8cee-daea0a8d0f3e-catalog-content\") pod \"947e7159-64b1-413f-8cee-daea0a8d0f3e\" (UID: \"947e7159-64b1-413f-8cee-daea0a8d0f3e\") " Mar 10 00:10:13 crc kubenswrapper[4906]: I0310 00:10:13.620768 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pm68j\" (UniqueName: \"kubernetes.io/projected/947e7159-64b1-413f-8cee-daea0a8d0f3e-kube-api-access-pm68j\") pod \"947e7159-64b1-413f-8cee-daea0a8d0f3e\" (UID: \"947e7159-64b1-413f-8cee-daea0a8d0f3e\") " Mar 10 00:10:13 crc kubenswrapper[4906]: I0310 00:10:13.620827 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/947e7159-64b1-413f-8cee-daea0a8d0f3e-utilities\") pod \"947e7159-64b1-413f-8cee-daea0a8d0f3e\" (UID: \"947e7159-64b1-413f-8cee-daea0a8d0f3e\") " Mar 10 00:10:13 crc kubenswrapper[4906]: I0310 00:10:13.621795 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/947e7159-64b1-413f-8cee-daea0a8d0f3e-utilities" (OuterVolumeSpecName: "utilities") pod "947e7159-64b1-413f-8cee-daea0a8d0f3e" (UID: "947e7159-64b1-413f-8cee-daea0a8d0f3e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:10:13 crc kubenswrapper[4906]: I0310 00:10:13.630932 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/947e7159-64b1-413f-8cee-daea0a8d0f3e-kube-api-access-pm68j" (OuterVolumeSpecName: "kube-api-access-pm68j") pod "947e7159-64b1-413f-8cee-daea0a8d0f3e" (UID: "947e7159-64b1-413f-8cee-daea0a8d0f3e"). InnerVolumeSpecName "kube-api-access-pm68j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:10:13 crc kubenswrapper[4906]: I0310 00:10:13.722963 4906 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/947e7159-64b1-413f-8cee-daea0a8d0f3e-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:13 crc kubenswrapper[4906]: I0310 00:10:13.723277 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pm68j\" (UniqueName: \"kubernetes.io/projected/947e7159-64b1-413f-8cee-daea0a8d0f3e-kube-api-access-pm68j\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:13 crc kubenswrapper[4906]: I0310 00:10:13.772177 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/947e7159-64b1-413f-8cee-daea0a8d0f3e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "947e7159-64b1-413f-8cee-daea0a8d0f3e" (UID: "947e7159-64b1-413f-8cee-daea0a8d0f3e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:10:13 crc kubenswrapper[4906]: I0310 00:10:13.823995 4906 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/947e7159-64b1-413f-8cee-daea0a8d0f3e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:14 crc kubenswrapper[4906]: I0310 00:10:14.185749 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m5t8r" event={"ID":"947e7159-64b1-413f-8cee-daea0a8d0f3e","Type":"ContainerDied","Data":"11741743aaf42428c752a9681edeb285f380fce0e15722827ea4e2a47cc76a2f"} Mar 10 00:10:14 crc kubenswrapper[4906]: I0310 00:10:14.186202 4906 scope.go:117] "RemoveContainer" containerID="5e87f60c38a416b4361c3d348c8d7fd07655cfdd139585279fd93a787cdacb4c" Mar 10 00:10:14 crc kubenswrapper[4906]: I0310 00:10:14.185981 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m5t8r" Mar 10 00:10:14 crc kubenswrapper[4906]: I0310 00:10:14.209811 4906 scope.go:117] "RemoveContainer" containerID="710e1b4602077bc734ff3424a05e8f4b902cf0266148e992a74274a460cd25fc" Mar 10 00:10:14 crc kubenswrapper[4906]: I0310 00:10:14.238068 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m5t8r"] Mar 10 00:10:14 crc kubenswrapper[4906]: I0310 00:10:14.240503 4906 scope.go:117] "RemoveContainer" containerID="e861630965e8a71e53ab2e340a22c2329aaa7f93a29f8c2ff77e420b64a9f9fc" Mar 10 00:10:14 crc kubenswrapper[4906]: I0310 00:10:14.242030 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-m5t8r"] Mar 10 00:10:14 crc kubenswrapper[4906]: I0310 00:10:14.585145 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="947e7159-64b1-413f-8cee-daea0a8d0f3e" path="/var/lib/kubelet/pods/947e7159-64b1-413f-8cee-daea0a8d0f3e/volumes" Mar 10 00:10:15 crc 
kubenswrapper[4906]: I0310 00:10:15.077311 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5868864b8c-z4kkl"] Mar 10 00:10:15 crc kubenswrapper[4906]: I0310 00:10:15.077700 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5868864b8c-z4kkl" podUID="075a851f-d16a-43b4-8ced-44073d4f7810" containerName="controller-manager" containerID="cri-o://8412ad8d4ec241b26510eebc9ee336d39ac36878db237652761b29dfae86fc2b" gracePeriod=30 Mar 10 00:10:15 crc kubenswrapper[4906]: I0310 00:10:15.124076 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d786d5d8c-6l67j"] Mar 10 00:10:15 crc kubenswrapper[4906]: I0310 00:10:15.124316 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5d786d5d8c-6l67j" podUID="accac582-bb83-4d5c-ae2d-48797c5aeb03" containerName="route-controller-manager" containerID="cri-o://9fa9d9965f2af145a772981fd6397ed8132ab7e390f7e92d3ccbe88fb7afd3f9" gracePeriod=30 Mar 10 00:10:15 crc kubenswrapper[4906]: I0310 00:10:15.621578 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d786d5d8c-6l67j" Mar 10 00:10:15 crc kubenswrapper[4906]: I0310 00:10:15.721650 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5868864b8c-z4kkl" Mar 10 00:10:15 crc kubenswrapper[4906]: I0310 00:10:15.778675 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddfnp\" (UniqueName: \"kubernetes.io/projected/075a851f-d16a-43b4-8ced-44073d4f7810-kube-api-access-ddfnp\") pod \"075a851f-d16a-43b4-8ced-44073d4f7810\" (UID: \"075a851f-d16a-43b4-8ced-44073d4f7810\") " Mar 10 00:10:15 crc kubenswrapper[4906]: I0310 00:10:15.778728 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/075a851f-d16a-43b4-8ced-44073d4f7810-config\") pod \"075a851f-d16a-43b4-8ced-44073d4f7810\" (UID: \"075a851f-d16a-43b4-8ced-44073d4f7810\") " Mar 10 00:10:15 crc kubenswrapper[4906]: I0310 00:10:15.778748 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/accac582-bb83-4d5c-ae2d-48797c5aeb03-serving-cert\") pod \"accac582-bb83-4d5c-ae2d-48797c5aeb03\" (UID: \"accac582-bb83-4d5c-ae2d-48797c5aeb03\") " Mar 10 00:10:15 crc kubenswrapper[4906]: I0310 00:10:15.778774 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/accac582-bb83-4d5c-ae2d-48797c5aeb03-client-ca\") pod \"accac582-bb83-4d5c-ae2d-48797c5aeb03\" (UID: \"accac582-bb83-4d5c-ae2d-48797c5aeb03\") " Mar 10 00:10:15 crc kubenswrapper[4906]: I0310 00:10:15.778806 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/075a851f-d16a-43b4-8ced-44073d4f7810-proxy-ca-bundles\") pod \"075a851f-d16a-43b4-8ced-44073d4f7810\" (UID: \"075a851f-d16a-43b4-8ced-44073d4f7810\") " Mar 10 00:10:15 crc kubenswrapper[4906]: I0310 00:10:15.778832 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/075a851f-d16a-43b4-8ced-44073d4f7810-client-ca\") pod \"075a851f-d16a-43b4-8ced-44073d4f7810\" (UID: \"075a851f-d16a-43b4-8ced-44073d4f7810\") " Mar 10 00:10:15 crc kubenswrapper[4906]: I0310 00:10:15.778856 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/accac582-bb83-4d5c-ae2d-48797c5aeb03-config\") pod \"accac582-bb83-4d5c-ae2d-48797c5aeb03\" (UID: \"accac582-bb83-4d5c-ae2d-48797c5aeb03\") " Mar 10 00:10:15 crc kubenswrapper[4906]: I0310 00:10:15.778877 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/075a851f-d16a-43b4-8ced-44073d4f7810-serving-cert\") pod \"075a851f-d16a-43b4-8ced-44073d4f7810\" (UID: \"075a851f-d16a-43b4-8ced-44073d4f7810\") " Mar 10 00:10:15 crc kubenswrapper[4906]: I0310 00:10:15.778899 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8wh7\" (UniqueName: \"kubernetes.io/projected/accac582-bb83-4d5c-ae2d-48797c5aeb03-kube-api-access-h8wh7\") pod \"accac582-bb83-4d5c-ae2d-48797c5aeb03\" (UID: \"accac582-bb83-4d5c-ae2d-48797c5aeb03\") " Mar 10 00:10:15 crc kubenswrapper[4906]: I0310 00:10:15.781441 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/075a851f-d16a-43b4-8ced-44073d4f7810-client-ca" (OuterVolumeSpecName: "client-ca") pod "075a851f-d16a-43b4-8ced-44073d4f7810" (UID: "075a851f-d16a-43b4-8ced-44073d4f7810"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:10:15 crc kubenswrapper[4906]: I0310 00:10:15.782461 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/accac582-bb83-4d5c-ae2d-48797c5aeb03-client-ca" (OuterVolumeSpecName: "client-ca") pod "accac582-bb83-4d5c-ae2d-48797c5aeb03" (UID: "accac582-bb83-4d5c-ae2d-48797c5aeb03"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:10:15 crc kubenswrapper[4906]: I0310 00:10:15.782796 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/075a851f-d16a-43b4-8ced-44073d4f7810-config" (OuterVolumeSpecName: "config") pod "075a851f-d16a-43b4-8ced-44073d4f7810" (UID: "075a851f-d16a-43b4-8ced-44073d4f7810"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:10:15 crc kubenswrapper[4906]: I0310 00:10:15.782919 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/075a851f-d16a-43b4-8ced-44073d4f7810-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "075a851f-d16a-43b4-8ced-44073d4f7810" (UID: "075a851f-d16a-43b4-8ced-44073d4f7810"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:10:15 crc kubenswrapper[4906]: I0310 00:10:15.785135 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/accac582-bb83-4d5c-ae2d-48797c5aeb03-config" (OuterVolumeSpecName: "config") pod "accac582-bb83-4d5c-ae2d-48797c5aeb03" (UID: "accac582-bb83-4d5c-ae2d-48797c5aeb03"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:10:15 crc kubenswrapper[4906]: I0310 00:10:15.788279 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/075a851f-d16a-43b4-8ced-44073d4f7810-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "075a851f-d16a-43b4-8ced-44073d4f7810" (UID: "075a851f-d16a-43b4-8ced-44073d4f7810"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:10:15 crc kubenswrapper[4906]: I0310 00:10:15.788315 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/075a851f-d16a-43b4-8ced-44073d4f7810-kube-api-access-ddfnp" (OuterVolumeSpecName: "kube-api-access-ddfnp") pod "075a851f-d16a-43b4-8ced-44073d4f7810" (UID: "075a851f-d16a-43b4-8ced-44073d4f7810"). InnerVolumeSpecName "kube-api-access-ddfnp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:10:15 crc kubenswrapper[4906]: I0310 00:10:15.788543 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/accac582-bb83-4d5c-ae2d-48797c5aeb03-kube-api-access-h8wh7" (OuterVolumeSpecName: "kube-api-access-h8wh7") pod "accac582-bb83-4d5c-ae2d-48797c5aeb03" (UID: "accac582-bb83-4d5c-ae2d-48797c5aeb03"). InnerVolumeSpecName "kube-api-access-h8wh7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:10:15 crc kubenswrapper[4906]: I0310 00:10:15.797538 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/accac582-bb83-4d5c-ae2d-48797c5aeb03-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "accac582-bb83-4d5c-ae2d-48797c5aeb03" (UID: "accac582-bb83-4d5c-ae2d-48797c5aeb03"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:10:15 crc kubenswrapper[4906]: I0310 00:10:15.881030 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddfnp\" (UniqueName: \"kubernetes.io/projected/075a851f-d16a-43b4-8ced-44073d4f7810-kube-api-access-ddfnp\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:15 crc kubenswrapper[4906]: I0310 00:10:15.881088 4906 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/075a851f-d16a-43b4-8ced-44073d4f7810-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:15 crc kubenswrapper[4906]: I0310 00:10:15.881111 4906 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/accac582-bb83-4d5c-ae2d-48797c5aeb03-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:15 crc kubenswrapper[4906]: I0310 00:10:15.881131 4906 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/accac582-bb83-4d5c-ae2d-48797c5aeb03-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:15 crc kubenswrapper[4906]: I0310 00:10:15.881147 4906 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/075a851f-d16a-43b4-8ced-44073d4f7810-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:15 crc kubenswrapper[4906]: I0310 00:10:15.881168 4906 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/075a851f-d16a-43b4-8ced-44073d4f7810-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:15 crc kubenswrapper[4906]: I0310 00:10:15.881187 4906 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/accac582-bb83-4d5c-ae2d-48797c5aeb03-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:15 crc kubenswrapper[4906]: I0310 00:10:15.881204 4906 reconciler_common.go:293] "Volume detached 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/075a851f-d16a-43b4-8ced-44073d4f7810-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:15 crc kubenswrapper[4906]: I0310 00:10:15.881221 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8wh7\" (UniqueName: \"kubernetes.io/projected/accac582-bb83-4d5c-ae2d-48797c5aeb03-kube-api-access-h8wh7\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.215888 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-76f87444f8-c5ww8"] Mar 10 00:10:16 crc kubenswrapper[4906]: E0310 00:10:16.217909 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="accac582-bb83-4d5c-ae2d-48797c5aeb03" containerName="route-controller-manager" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.217955 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="accac582-bb83-4d5c-ae2d-48797c5aeb03" containerName="route-controller-manager" Mar 10 00:10:16 crc kubenswrapper[4906]: E0310 00:10:16.217983 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="075a851f-d16a-43b4-8ced-44073d4f7810" containerName="controller-manager" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.217996 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="075a851f-d16a-43b4-8ced-44073d4f7810" containerName="controller-manager" Mar 10 00:10:16 crc kubenswrapper[4906]: E0310 00:10:16.218014 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="947e7159-64b1-413f-8cee-daea0a8d0f3e" containerName="extract-content" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.218023 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="947e7159-64b1-413f-8cee-daea0a8d0f3e" containerName="extract-content" Mar 10 00:10:16 crc kubenswrapper[4906]: E0310 00:10:16.218034 4906 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="947e7159-64b1-413f-8cee-daea0a8d0f3e" containerName="registry-server" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.218041 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="947e7159-64b1-413f-8cee-daea0a8d0f3e" containerName="registry-server" Mar 10 00:10:16 crc kubenswrapper[4906]: E0310 00:10:16.218053 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="094c6270-b610-42c0-a6ce-3c146cb6bb6c" containerName="oc" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.218061 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="094c6270-b610-42c0-a6ce-3c146cb6bb6c" containerName="oc" Mar 10 00:10:16 crc kubenswrapper[4906]: E0310 00:10:16.218073 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d18b4f28-fc85-4aaf-af80-4272b80ef138" containerName="oc" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.218079 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="d18b4f28-fc85-4aaf-af80-4272b80ef138" containerName="oc" Mar 10 00:10:16 crc kubenswrapper[4906]: E0310 00:10:16.218097 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="947e7159-64b1-413f-8cee-daea0a8d0f3e" containerName="extract-utilities" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.218109 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="947e7159-64b1-413f-8cee-daea0a8d0f3e" containerName="extract-utilities" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.218340 4906 generic.go:334] "Generic (PLEG): container finished" podID="accac582-bb83-4d5c-ae2d-48797c5aeb03" containerID="9fa9d9965f2af145a772981fd6397ed8132ab7e390f7e92d3ccbe88fb7afd3f9" exitCode=0 Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.218455 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="947e7159-64b1-413f-8cee-daea0a8d0f3e" containerName="registry-server" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.218471 4906 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="075a851f-d16a-43b4-8ced-44073d4f7810" containerName="controller-manager" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.218480 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="d18b4f28-fc85-4aaf-af80-4272b80ef138" containerName="oc" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.218491 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="accac582-bb83-4d5c-ae2d-48797c5aeb03" containerName="route-controller-manager" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.218499 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="094c6270-b610-42c0-a6ce-3c146cb6bb6c" containerName="oc" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.218563 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d786d5d8c-6l67j" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.219084 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d786d5d8c-6l67j" event={"ID":"accac582-bb83-4d5c-ae2d-48797c5aeb03","Type":"ContainerDied","Data":"9fa9d9965f2af145a772981fd6397ed8132ab7e390f7e92d3ccbe88fb7afd3f9"} Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.219122 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d786d5d8c-6l67j" event={"ID":"accac582-bb83-4d5c-ae2d-48797c5aeb03","Type":"ContainerDied","Data":"b41485d866be1c520b2ce6ad7c5846f48e5232b25e9b6ebe39e8f5ce2b6105bb"} Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.219148 4906 scope.go:117] "RemoveContainer" containerID="9fa9d9965f2af145a772981fd6397ed8132ab7e390f7e92d3ccbe88fb7afd3f9" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.219342 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-76f87444f8-c5ww8" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.224493 4906 generic.go:334] "Generic (PLEG): container finished" podID="075a851f-d16a-43b4-8ced-44073d4f7810" containerID="8412ad8d4ec241b26510eebc9ee336d39ac36878db237652761b29dfae86fc2b" exitCode=0 Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.224536 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5868864b8c-z4kkl" event={"ID":"075a851f-d16a-43b4-8ced-44073d4f7810","Type":"ContainerDied","Data":"8412ad8d4ec241b26510eebc9ee336d39ac36878db237652761b29dfae86fc2b"} Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.224565 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5868864b8c-z4kkl" event={"ID":"075a851f-d16a-43b4-8ced-44073d4f7810","Type":"ContainerDied","Data":"a4f50e3865a4b10449ca40a6aebc57a8fd9cfdbf1616dd2a32fd96b271050dcb"} Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.224620 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5868864b8c-z4kkl" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.228973 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fhmhd" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.229027 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fhmhd" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.234692 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8bc597f48-dwzlt"] Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.236695 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8bc597f48-dwzlt" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.241159 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-76f87444f8-c5ww8"] Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.246324 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.247113 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.247677 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.247704 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.247850 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.247958 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.265704 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8bc597f48-dwzlt"] Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.272862 4906 scope.go:117] "RemoveContainer" containerID="9fa9d9965f2af145a772981fd6397ed8132ab7e390f7e92d3ccbe88fb7afd3f9" Mar 10 00:10:16 crc kubenswrapper[4906]: E0310 00:10:16.273443 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"9fa9d9965f2af145a772981fd6397ed8132ab7e390f7e92d3ccbe88fb7afd3f9\": container with ID starting with 9fa9d9965f2af145a772981fd6397ed8132ab7e390f7e92d3ccbe88fb7afd3f9 not found: ID does not exist" containerID="9fa9d9965f2af145a772981fd6397ed8132ab7e390f7e92d3ccbe88fb7afd3f9" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.273486 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fa9d9965f2af145a772981fd6397ed8132ab7e390f7e92d3ccbe88fb7afd3f9"} err="failed to get container status \"9fa9d9965f2af145a772981fd6397ed8132ab7e390f7e92d3ccbe88fb7afd3f9\": rpc error: code = NotFound desc = could not find container \"9fa9d9965f2af145a772981fd6397ed8132ab7e390f7e92d3ccbe88fb7afd3f9\": container with ID starting with 9fa9d9965f2af145a772981fd6397ed8132ab7e390f7e92d3ccbe88fb7afd3f9 not found: ID does not exist" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.273511 4906 scope.go:117] "RemoveContainer" containerID="8412ad8d4ec241b26510eebc9ee336d39ac36878db237652761b29dfae86fc2b" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.286468 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/804ce14a-8267-4a2e-a90e-0572096372ef-proxy-ca-bundles\") pod \"controller-manager-76f87444f8-c5ww8\" (UID: \"804ce14a-8267-4a2e-a90e-0572096372ef\") " pod="openshift-controller-manager/controller-manager-76f87444f8-c5ww8" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.286693 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/804ce14a-8267-4a2e-a90e-0572096372ef-serving-cert\") pod \"controller-manager-76f87444f8-c5ww8\" (UID: \"804ce14a-8267-4a2e-a90e-0572096372ef\") " pod="openshift-controller-manager/controller-manager-76f87444f8-c5ww8" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.286914 
4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5eb5696-e150-477c-b145-2ce5f4161410-serving-cert\") pod \"route-controller-manager-8bc597f48-dwzlt\" (UID: \"b5eb5696-e150-477c-b145-2ce5f4161410\") " pod="openshift-route-controller-manager/route-controller-manager-8bc597f48-dwzlt" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.286949 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/804ce14a-8267-4a2e-a90e-0572096372ef-config\") pod \"controller-manager-76f87444f8-c5ww8\" (UID: \"804ce14a-8267-4a2e-a90e-0572096372ef\") " pod="openshift-controller-manager/controller-manager-76f87444f8-c5ww8" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.287085 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdb6v\" (UniqueName: \"kubernetes.io/projected/804ce14a-8267-4a2e-a90e-0572096372ef-kube-api-access-fdb6v\") pod \"controller-manager-76f87444f8-c5ww8\" (UID: \"804ce14a-8267-4a2e-a90e-0572096372ef\") " pod="openshift-controller-manager/controller-manager-76f87444f8-c5ww8" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.287131 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5eb5696-e150-477c-b145-2ce5f4161410-config\") pod \"route-controller-manager-8bc597f48-dwzlt\" (UID: \"b5eb5696-e150-477c-b145-2ce5f4161410\") " pod="openshift-route-controller-manager/route-controller-manager-8bc597f48-dwzlt" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.287276 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mw76\" (UniqueName: \"kubernetes.io/projected/b5eb5696-e150-477c-b145-2ce5f4161410-kube-api-access-2mw76\") 
pod \"route-controller-manager-8bc597f48-dwzlt\" (UID: \"b5eb5696-e150-477c-b145-2ce5f4161410\") " pod="openshift-route-controller-manager/route-controller-manager-8bc597f48-dwzlt" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.287430 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5eb5696-e150-477c-b145-2ce5f4161410-client-ca\") pod \"route-controller-manager-8bc597f48-dwzlt\" (UID: \"b5eb5696-e150-477c-b145-2ce5f4161410\") " pod="openshift-route-controller-manager/route-controller-manager-8bc597f48-dwzlt" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.287580 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/804ce14a-8267-4a2e-a90e-0572096372ef-client-ca\") pod \"controller-manager-76f87444f8-c5ww8\" (UID: \"804ce14a-8267-4a2e-a90e-0572096372ef\") " pod="openshift-controller-manager/controller-manager-76f87444f8-c5ww8" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.299759 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5868864b8c-z4kkl"] Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.302971 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fhmhd" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.304338 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5868864b8c-z4kkl"] Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.315888 4906 scope.go:117] "RemoveContainer" containerID="8412ad8d4ec241b26510eebc9ee336d39ac36878db237652761b29dfae86fc2b" Mar 10 00:10:16 crc kubenswrapper[4906]: E0310 00:10:16.316618 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8412ad8d4ec241b26510eebc9ee336d39ac36878db237652761b29dfae86fc2b\": container with ID starting with 8412ad8d4ec241b26510eebc9ee336d39ac36878db237652761b29dfae86fc2b not found: ID does not exist" containerID="8412ad8d4ec241b26510eebc9ee336d39ac36878db237652761b29dfae86fc2b" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.316695 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8412ad8d4ec241b26510eebc9ee336d39ac36878db237652761b29dfae86fc2b"} err="failed to get container status \"8412ad8d4ec241b26510eebc9ee336d39ac36878db237652761b29dfae86fc2b\": rpc error: code = NotFound desc = could not find container \"8412ad8d4ec241b26510eebc9ee336d39ac36878db237652761b29dfae86fc2b\": container with ID starting with 8412ad8d4ec241b26510eebc9ee336d39ac36878db237652761b29dfae86fc2b not found: ID does not exist" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.316966 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d786d5d8c-6l67j"] Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.322038 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d786d5d8c-6l67j"] Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.331965 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nvvhx" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.332075 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nvvhx" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.379783 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nvvhx" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.388442 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/804ce14a-8267-4a2e-a90e-0572096372ef-proxy-ca-bundles\") pod \"controller-manager-76f87444f8-c5ww8\" (UID: \"804ce14a-8267-4a2e-a90e-0572096372ef\") " pod="openshift-controller-manager/controller-manager-76f87444f8-c5ww8" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.390237 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/804ce14a-8267-4a2e-a90e-0572096372ef-serving-cert\") pod \"controller-manager-76f87444f8-c5ww8\" (UID: \"804ce14a-8267-4a2e-a90e-0572096372ef\") " pod="openshift-controller-manager/controller-manager-76f87444f8-c5ww8" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.390300 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5eb5696-e150-477c-b145-2ce5f4161410-serving-cert\") pod \"route-controller-manager-8bc597f48-dwzlt\" (UID: \"b5eb5696-e150-477c-b145-2ce5f4161410\") " pod="openshift-route-controller-manager/route-controller-manager-8bc597f48-dwzlt" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.390344 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/804ce14a-8267-4a2e-a90e-0572096372ef-config\") pod \"controller-manager-76f87444f8-c5ww8\" (UID: \"804ce14a-8267-4a2e-a90e-0572096372ef\") " pod="openshift-controller-manager/controller-manager-76f87444f8-c5ww8" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.390369 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdb6v\" (UniqueName: \"kubernetes.io/projected/804ce14a-8267-4a2e-a90e-0572096372ef-kube-api-access-fdb6v\") pod \"controller-manager-76f87444f8-c5ww8\" (UID: \"804ce14a-8267-4a2e-a90e-0572096372ef\") " pod="openshift-controller-manager/controller-manager-76f87444f8-c5ww8" Mar 10 00:10:16 crc 
kubenswrapper[4906]: I0310 00:10:16.390416 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5eb5696-e150-477c-b145-2ce5f4161410-config\") pod \"route-controller-manager-8bc597f48-dwzlt\" (UID: \"b5eb5696-e150-477c-b145-2ce5f4161410\") " pod="openshift-route-controller-manager/route-controller-manager-8bc597f48-dwzlt" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.390464 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mw76\" (UniqueName: \"kubernetes.io/projected/b5eb5696-e150-477c-b145-2ce5f4161410-kube-api-access-2mw76\") pod \"route-controller-manager-8bc597f48-dwzlt\" (UID: \"b5eb5696-e150-477c-b145-2ce5f4161410\") " pod="openshift-route-controller-manager/route-controller-manager-8bc597f48-dwzlt" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.390548 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5eb5696-e150-477c-b145-2ce5f4161410-client-ca\") pod \"route-controller-manager-8bc597f48-dwzlt\" (UID: \"b5eb5696-e150-477c-b145-2ce5f4161410\") " pod="openshift-route-controller-manager/route-controller-manager-8bc597f48-dwzlt" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.390597 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/804ce14a-8267-4a2e-a90e-0572096372ef-client-ca\") pod \"controller-manager-76f87444f8-c5ww8\" (UID: \"804ce14a-8267-4a2e-a90e-0572096372ef\") " pod="openshift-controller-manager/controller-manager-76f87444f8-c5ww8" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.390666 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/804ce14a-8267-4a2e-a90e-0572096372ef-proxy-ca-bundles\") pod \"controller-manager-76f87444f8-c5ww8\" (UID: 
\"804ce14a-8267-4a2e-a90e-0572096372ef\") " pod="openshift-controller-manager/controller-manager-76f87444f8-c5ww8" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.392225 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5eb5696-e150-477c-b145-2ce5f4161410-client-ca\") pod \"route-controller-manager-8bc597f48-dwzlt\" (UID: \"b5eb5696-e150-477c-b145-2ce5f4161410\") " pod="openshift-route-controller-manager/route-controller-manager-8bc597f48-dwzlt" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.392247 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/804ce14a-8267-4a2e-a90e-0572096372ef-client-ca\") pod \"controller-manager-76f87444f8-c5ww8\" (UID: \"804ce14a-8267-4a2e-a90e-0572096372ef\") " pod="openshift-controller-manager/controller-manager-76f87444f8-c5ww8" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.392402 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5eb5696-e150-477c-b145-2ce5f4161410-config\") pod \"route-controller-manager-8bc597f48-dwzlt\" (UID: \"b5eb5696-e150-477c-b145-2ce5f4161410\") " pod="openshift-route-controller-manager/route-controller-manager-8bc597f48-dwzlt" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.393489 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/804ce14a-8267-4a2e-a90e-0572096372ef-config\") pod \"controller-manager-76f87444f8-c5ww8\" (UID: \"804ce14a-8267-4a2e-a90e-0572096372ef\") " pod="openshift-controller-manager/controller-manager-76f87444f8-c5ww8" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.398585 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5eb5696-e150-477c-b145-2ce5f4161410-serving-cert\") pod 
\"route-controller-manager-8bc597f48-dwzlt\" (UID: \"b5eb5696-e150-477c-b145-2ce5f4161410\") " pod="openshift-route-controller-manager/route-controller-manager-8bc597f48-dwzlt" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.399710 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/804ce14a-8267-4a2e-a90e-0572096372ef-serving-cert\") pod \"controller-manager-76f87444f8-c5ww8\" (UID: \"804ce14a-8267-4a2e-a90e-0572096372ef\") " pod="openshift-controller-manager/controller-manager-76f87444f8-c5ww8" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.420464 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdb6v\" (UniqueName: \"kubernetes.io/projected/804ce14a-8267-4a2e-a90e-0572096372ef-kube-api-access-fdb6v\") pod \"controller-manager-76f87444f8-c5ww8\" (UID: \"804ce14a-8267-4a2e-a90e-0572096372ef\") " pod="openshift-controller-manager/controller-manager-76f87444f8-c5ww8" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.426411 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mw76\" (UniqueName: \"kubernetes.io/projected/b5eb5696-e150-477c-b145-2ce5f4161410-kube-api-access-2mw76\") pod \"route-controller-manager-8bc597f48-dwzlt\" (UID: \"b5eb5696-e150-477c-b145-2ce5f4161410\") " pod="openshift-route-controller-manager/route-controller-manager-8bc597f48-dwzlt" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.431057 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jbwl2" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.431094 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jbwl2" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.481171 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-jbwl2" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.558555 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-76f87444f8-c5ww8" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.568748 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8bc597f48-dwzlt" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.584438 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="075a851f-d16a-43b4-8ced-44073d4f7810" path="/var/lib/kubelet/pods/075a851f-d16a-43b4-8ced-44073d4f7810/volumes" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.584985 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="accac582-bb83-4d5c-ae2d-48797c5aeb03" path="/var/lib/kubelet/pods/accac582-bb83-4d5c-ae2d-48797c5aeb03/volumes" Mar 10 00:10:16 crc kubenswrapper[4906]: I0310 00:10:16.974944 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-76f87444f8-c5ww8"] Mar 10 00:10:16 crc kubenswrapper[4906]: W0310 00:10:16.984425 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod804ce14a_8267_4a2e_a90e_0572096372ef.slice/crio-c16120d4680e094882c77f761c4ec89be844bb7473f75d0752e095499d1c1172 WatchSource:0}: Error finding container c16120d4680e094882c77f761c4ec89be844bb7473f75d0752e095499d1c1172: Status 404 returned error can't find the container with id c16120d4680e094882c77f761c4ec89be844bb7473f75d0752e095499d1c1172 Mar 10 00:10:17 crc kubenswrapper[4906]: I0310 00:10:17.053993 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8bc597f48-dwzlt"] Mar 10 00:10:17 crc kubenswrapper[4906]: W0310 00:10:17.072146 4906 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5eb5696_e150_477c_b145_2ce5f4161410.slice/crio-04888686209a042f3306462ed596a8c04bd7118f0a66b598ab193109f0faff58 WatchSource:0}: Error finding container 04888686209a042f3306462ed596a8c04bd7118f0a66b598ab193109f0faff58: Status 404 returned error can't find the container with id 04888686209a042f3306462ed596a8c04bd7118f0a66b598ab193109f0faff58 Mar 10 00:10:17 crc kubenswrapper[4906]: I0310 00:10:17.236120 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8bc597f48-dwzlt" event={"ID":"b5eb5696-e150-477c-b145-2ce5f4161410","Type":"ContainerStarted","Data":"04888686209a042f3306462ed596a8c04bd7118f0a66b598ab193109f0faff58"} Mar 10 00:10:17 crc kubenswrapper[4906]: I0310 00:10:17.239056 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76f87444f8-c5ww8" event={"ID":"804ce14a-8267-4a2e-a90e-0572096372ef","Type":"ContainerStarted","Data":"c16120d4680e094882c77f761c4ec89be844bb7473f75d0752e095499d1c1172"} Mar 10 00:10:17 crc kubenswrapper[4906]: I0310 00:10:17.287247 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fhmhd" Mar 10 00:10:17 crc kubenswrapper[4906]: I0310 00:10:17.290670 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nvvhx" Mar 10 00:10:17 crc kubenswrapper[4906]: I0310 00:10:17.303055 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jbwl2" Mar 10 00:10:17 crc kubenswrapper[4906]: I0310 00:10:17.944050 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-whmcg" Mar 10 00:10:17 crc kubenswrapper[4906]: I0310 00:10:17.944151 4906 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-whmcg" Mar 10 00:10:17 crc kubenswrapper[4906]: I0310 00:10:17.994063 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-whmcg" Mar 10 00:10:18 crc kubenswrapper[4906]: I0310 00:10:18.253056 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8bc597f48-dwzlt" event={"ID":"b5eb5696-e150-477c-b145-2ce5f4161410","Type":"ContainerStarted","Data":"b5720c002243b3ca69d5c4d46d0d91442e2341bc795ff578132042a6ab360b9e"} Mar 10 00:10:18 crc kubenswrapper[4906]: I0310 00:10:18.253464 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-8bc597f48-dwzlt" Mar 10 00:10:18 crc kubenswrapper[4906]: I0310 00:10:18.255106 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76f87444f8-c5ww8" event={"ID":"804ce14a-8267-4a2e-a90e-0572096372ef","Type":"ContainerStarted","Data":"fda7be75d8a2f8e53ba2b2d838e62068b033c991ed85bb33108deb3630c4a3c7"} Mar 10 00:10:18 crc kubenswrapper[4906]: I0310 00:10:18.261145 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-8bc597f48-dwzlt" Mar 10 00:10:18 crc kubenswrapper[4906]: I0310 00:10:18.294045 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-76f87444f8-c5ww8" podStartSLOduration=3.294020961 podStartE2EDuration="3.294020961s" podCreationTimestamp="2026-03-10 00:10:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:10:18.292155218 +0000 UTC m=+244.440050330" watchObservedRunningTime="2026-03-10 00:10:18.294020961 +0000 UTC 
m=+244.441916083" Mar 10 00:10:18 crc kubenswrapper[4906]: I0310 00:10:18.296127 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-8bc597f48-dwzlt" podStartSLOduration=3.29611582 podStartE2EDuration="3.29611582s" podCreationTimestamp="2026-03-10 00:10:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:10:18.27701952 +0000 UTC m=+244.424914662" watchObservedRunningTime="2026-03-10 00:10:18.29611582 +0000 UTC m=+244.444010942" Mar 10 00:10:18 crc kubenswrapper[4906]: I0310 00:10:18.313431 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-whmcg" Mar 10 00:10:18 crc kubenswrapper[4906]: I0310 00:10:18.419681 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-js2xl" Mar 10 00:10:18 crc kubenswrapper[4906]: I0310 00:10:18.420234 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-js2xl" Mar 10 00:10:18 crc kubenswrapper[4906]: I0310 00:10:18.462853 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-js2xl" Mar 10 00:10:18 crc kubenswrapper[4906]: I0310 00:10:18.823124 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fhmhd"] Mar 10 00:10:19 crc kubenswrapper[4906]: I0310 00:10:19.009322 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nwlr8" Mar 10 00:10:19 crc kubenswrapper[4906]: I0310 00:10:19.052870 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nwlr8" Mar 10 00:10:19 crc kubenswrapper[4906]: I0310 00:10:19.260954 4906 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fhmhd" podUID="52694cc4-226b-4bd5-a6c7-0ebd711926e2" containerName="registry-server" containerID="cri-o://086773ee90732ab31fb14a62f923340b51b27dd62d3352d268d528ae89554642" gracePeriod=2 Mar 10 00:10:19 crc kubenswrapper[4906]: I0310 00:10:19.261198 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-76f87444f8-c5ww8" Mar 10 00:10:19 crc kubenswrapper[4906]: I0310 00:10:19.267676 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-76f87444f8-c5ww8" Mar 10 00:10:19 crc kubenswrapper[4906]: I0310 00:10:19.316577 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-js2xl" Mar 10 00:10:19 crc kubenswrapper[4906]: I0310 00:10:19.425412 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jbwl2"] Mar 10 00:10:19 crc kubenswrapper[4906]: I0310 00:10:19.426011 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jbwl2" podUID="bfd0c098-c58b-456c-a9b2-270e749bc274" containerName="registry-server" containerID="cri-o://62ac30d017905e7854001a3c88c6792379fd5521640721632c05fa39acfb3dfe" gracePeriod=2 Mar 10 00:10:19 crc kubenswrapper[4906]: I0310 00:10:19.724807 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fhmhd" Mar 10 00:10:19 crc kubenswrapper[4906]: I0310 00:10:19.846780 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52694cc4-226b-4bd5-a6c7-0ebd711926e2-catalog-content\") pod \"52694cc4-226b-4bd5-a6c7-0ebd711926e2\" (UID: \"52694cc4-226b-4bd5-a6c7-0ebd711926e2\") " Mar 10 00:10:19 crc kubenswrapper[4906]: I0310 00:10:19.846890 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52694cc4-226b-4bd5-a6c7-0ebd711926e2-utilities\") pod \"52694cc4-226b-4bd5-a6c7-0ebd711926e2\" (UID: \"52694cc4-226b-4bd5-a6c7-0ebd711926e2\") " Mar 10 00:10:19 crc kubenswrapper[4906]: I0310 00:10:19.846942 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxrkm\" (UniqueName: \"kubernetes.io/projected/52694cc4-226b-4bd5-a6c7-0ebd711926e2-kube-api-access-gxrkm\") pod \"52694cc4-226b-4bd5-a6c7-0ebd711926e2\" (UID: \"52694cc4-226b-4bd5-a6c7-0ebd711926e2\") " Mar 10 00:10:19 crc kubenswrapper[4906]: I0310 00:10:19.847661 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52694cc4-226b-4bd5-a6c7-0ebd711926e2-utilities" (OuterVolumeSpecName: "utilities") pod "52694cc4-226b-4bd5-a6c7-0ebd711926e2" (UID: "52694cc4-226b-4bd5-a6c7-0ebd711926e2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:10:19 crc kubenswrapper[4906]: I0310 00:10:19.852159 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52694cc4-226b-4bd5-a6c7-0ebd711926e2-kube-api-access-gxrkm" (OuterVolumeSpecName: "kube-api-access-gxrkm") pod "52694cc4-226b-4bd5-a6c7-0ebd711926e2" (UID: "52694cc4-226b-4bd5-a6c7-0ebd711926e2"). InnerVolumeSpecName "kube-api-access-gxrkm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:10:19 crc kubenswrapper[4906]: I0310 00:10:19.888968 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jbwl2" Mar 10 00:10:19 crc kubenswrapper[4906]: I0310 00:10:19.897364 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52694cc4-226b-4bd5-a6c7-0ebd711926e2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "52694cc4-226b-4bd5-a6c7-0ebd711926e2" (UID: "52694cc4-226b-4bd5-a6c7-0ebd711926e2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:10:19 crc kubenswrapper[4906]: I0310 00:10:19.948120 4906 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52694cc4-226b-4bd5-a6c7-0ebd711926e2-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:19 crc kubenswrapper[4906]: I0310 00:10:19.948159 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxrkm\" (UniqueName: \"kubernetes.io/projected/52694cc4-226b-4bd5-a6c7-0ebd711926e2-kube-api-access-gxrkm\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:19 crc kubenswrapper[4906]: I0310 00:10:19.948175 4906 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52694cc4-226b-4bd5-a6c7-0ebd711926e2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:20 crc kubenswrapper[4906]: I0310 00:10:20.049339 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfd0c098-c58b-456c-a9b2-270e749bc274-catalog-content\") pod \"bfd0c098-c58b-456c-a9b2-270e749bc274\" (UID: \"bfd0c098-c58b-456c-a9b2-270e749bc274\") " Mar 10 00:10:20 crc kubenswrapper[4906]: I0310 00:10:20.049430 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfd0c098-c58b-456c-a9b2-270e749bc274-utilities\") pod \"bfd0c098-c58b-456c-a9b2-270e749bc274\" (UID: \"bfd0c098-c58b-456c-a9b2-270e749bc274\") " Mar 10 00:10:20 crc kubenswrapper[4906]: I0310 00:10:20.049706 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8576\" (UniqueName: \"kubernetes.io/projected/bfd0c098-c58b-456c-a9b2-270e749bc274-kube-api-access-d8576\") pod \"bfd0c098-c58b-456c-a9b2-270e749bc274\" (UID: \"bfd0c098-c58b-456c-a9b2-270e749bc274\") " Mar 10 00:10:20 crc kubenswrapper[4906]: I0310 00:10:20.050535 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfd0c098-c58b-456c-a9b2-270e749bc274-utilities" (OuterVolumeSpecName: "utilities") pod "bfd0c098-c58b-456c-a9b2-270e749bc274" (UID: "bfd0c098-c58b-456c-a9b2-270e749bc274"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:10:20 crc kubenswrapper[4906]: I0310 00:10:20.052423 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfd0c098-c58b-456c-a9b2-270e749bc274-kube-api-access-d8576" (OuterVolumeSpecName: "kube-api-access-d8576") pod "bfd0c098-c58b-456c-a9b2-270e749bc274" (UID: "bfd0c098-c58b-456c-a9b2-270e749bc274"). InnerVolumeSpecName "kube-api-access-d8576". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:10:20 crc kubenswrapper[4906]: I0310 00:10:20.103562 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfd0c098-c58b-456c-a9b2-270e749bc274-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bfd0c098-c58b-456c-a9b2-270e749bc274" (UID: "bfd0c098-c58b-456c-a9b2-270e749bc274"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:10:20 crc kubenswrapper[4906]: I0310 00:10:20.151447 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8576\" (UniqueName: \"kubernetes.io/projected/bfd0c098-c58b-456c-a9b2-270e749bc274-kube-api-access-d8576\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:20 crc kubenswrapper[4906]: I0310 00:10:20.151500 4906 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfd0c098-c58b-456c-a9b2-270e749bc274-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:20 crc kubenswrapper[4906]: I0310 00:10:20.151520 4906 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfd0c098-c58b-456c-a9b2-270e749bc274-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:20 crc kubenswrapper[4906]: I0310 00:10:20.270879 4906 generic.go:334] "Generic (PLEG): container finished" podID="bfd0c098-c58b-456c-a9b2-270e749bc274" containerID="62ac30d017905e7854001a3c88c6792379fd5521640721632c05fa39acfb3dfe" exitCode=0 Mar 10 00:10:20 crc kubenswrapper[4906]: I0310 00:10:20.270957 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jbwl2" event={"ID":"bfd0c098-c58b-456c-a9b2-270e749bc274","Type":"ContainerDied","Data":"62ac30d017905e7854001a3c88c6792379fd5521640721632c05fa39acfb3dfe"} Mar 10 00:10:20 crc kubenswrapper[4906]: I0310 00:10:20.270986 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jbwl2" Mar 10 00:10:20 crc kubenswrapper[4906]: I0310 00:10:20.271011 4906 scope.go:117] "RemoveContainer" containerID="62ac30d017905e7854001a3c88c6792379fd5521640721632c05fa39acfb3dfe" Mar 10 00:10:20 crc kubenswrapper[4906]: I0310 00:10:20.270997 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jbwl2" event={"ID":"bfd0c098-c58b-456c-a9b2-270e749bc274","Type":"ContainerDied","Data":"feef254d9c7c35598955de9ae64d74d525c038cba7aaedbab6d167176c2d656a"} Mar 10 00:10:20 crc kubenswrapper[4906]: I0310 00:10:20.274060 4906 generic.go:334] "Generic (PLEG): container finished" podID="52694cc4-226b-4bd5-a6c7-0ebd711926e2" containerID="086773ee90732ab31fb14a62f923340b51b27dd62d3352d268d528ae89554642" exitCode=0 Mar 10 00:10:20 crc kubenswrapper[4906]: I0310 00:10:20.274117 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fhmhd" Mar 10 00:10:20 crc kubenswrapper[4906]: I0310 00:10:20.274158 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhmhd" event={"ID":"52694cc4-226b-4bd5-a6c7-0ebd711926e2","Type":"ContainerDied","Data":"086773ee90732ab31fb14a62f923340b51b27dd62d3352d268d528ae89554642"} Mar 10 00:10:20 crc kubenswrapper[4906]: I0310 00:10:20.274190 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhmhd" event={"ID":"52694cc4-226b-4bd5-a6c7-0ebd711926e2","Type":"ContainerDied","Data":"531e6a51b8153b56cfa30ec89f62b24a5d768da45e272b94e7f98afec3db106f"} Mar 10 00:10:20 crc kubenswrapper[4906]: I0310 00:10:20.292291 4906 scope.go:117] "RemoveContainer" containerID="4ce3047f440ddec2444fb9a4f1de15726c662548dfbdcf77e785c010af32ce70" Mar 10 00:10:20 crc kubenswrapper[4906]: I0310 00:10:20.313190 4906 scope.go:117] "RemoveContainer" 
containerID="0b72a984e6ae9d267a0f0503a195eafcc0e1971c0a2345fe1abd9448f7f97767" Mar 10 00:10:20 crc kubenswrapper[4906]: I0310 00:10:20.313716 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jbwl2"] Mar 10 00:10:20 crc kubenswrapper[4906]: I0310 00:10:20.322842 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jbwl2"] Mar 10 00:10:20 crc kubenswrapper[4906]: I0310 00:10:20.327914 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fhmhd"] Mar 10 00:10:20 crc kubenswrapper[4906]: I0310 00:10:20.332039 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fhmhd"] Mar 10 00:10:20 crc kubenswrapper[4906]: I0310 00:10:20.335818 4906 scope.go:117] "RemoveContainer" containerID="62ac30d017905e7854001a3c88c6792379fd5521640721632c05fa39acfb3dfe" Mar 10 00:10:20 crc kubenswrapper[4906]: E0310 00:10:20.336350 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62ac30d017905e7854001a3c88c6792379fd5521640721632c05fa39acfb3dfe\": container with ID starting with 62ac30d017905e7854001a3c88c6792379fd5521640721632c05fa39acfb3dfe not found: ID does not exist" containerID="62ac30d017905e7854001a3c88c6792379fd5521640721632c05fa39acfb3dfe" Mar 10 00:10:20 crc kubenswrapper[4906]: I0310 00:10:20.336396 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62ac30d017905e7854001a3c88c6792379fd5521640721632c05fa39acfb3dfe"} err="failed to get container status \"62ac30d017905e7854001a3c88c6792379fd5521640721632c05fa39acfb3dfe\": rpc error: code = NotFound desc = could not find container \"62ac30d017905e7854001a3c88c6792379fd5521640721632c05fa39acfb3dfe\": container with ID starting with 62ac30d017905e7854001a3c88c6792379fd5521640721632c05fa39acfb3dfe not found: ID does 
not exist" Mar 10 00:10:20 crc kubenswrapper[4906]: I0310 00:10:20.336426 4906 scope.go:117] "RemoveContainer" containerID="4ce3047f440ddec2444fb9a4f1de15726c662548dfbdcf77e785c010af32ce70" Mar 10 00:10:20 crc kubenswrapper[4906]: E0310 00:10:20.336866 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ce3047f440ddec2444fb9a4f1de15726c662548dfbdcf77e785c010af32ce70\": container with ID starting with 4ce3047f440ddec2444fb9a4f1de15726c662548dfbdcf77e785c010af32ce70 not found: ID does not exist" containerID="4ce3047f440ddec2444fb9a4f1de15726c662548dfbdcf77e785c010af32ce70" Mar 10 00:10:20 crc kubenswrapper[4906]: I0310 00:10:20.336903 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ce3047f440ddec2444fb9a4f1de15726c662548dfbdcf77e785c010af32ce70"} err="failed to get container status \"4ce3047f440ddec2444fb9a4f1de15726c662548dfbdcf77e785c010af32ce70\": rpc error: code = NotFound desc = could not find container \"4ce3047f440ddec2444fb9a4f1de15726c662548dfbdcf77e785c010af32ce70\": container with ID starting with 4ce3047f440ddec2444fb9a4f1de15726c662548dfbdcf77e785c010af32ce70 not found: ID does not exist" Mar 10 00:10:20 crc kubenswrapper[4906]: I0310 00:10:20.336925 4906 scope.go:117] "RemoveContainer" containerID="0b72a984e6ae9d267a0f0503a195eafcc0e1971c0a2345fe1abd9448f7f97767" Mar 10 00:10:20 crc kubenswrapper[4906]: E0310 00:10:20.337206 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b72a984e6ae9d267a0f0503a195eafcc0e1971c0a2345fe1abd9448f7f97767\": container with ID starting with 0b72a984e6ae9d267a0f0503a195eafcc0e1971c0a2345fe1abd9448f7f97767 not found: ID does not exist" containerID="0b72a984e6ae9d267a0f0503a195eafcc0e1971c0a2345fe1abd9448f7f97767" Mar 10 00:10:20 crc kubenswrapper[4906]: I0310 00:10:20.337265 4906 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b72a984e6ae9d267a0f0503a195eafcc0e1971c0a2345fe1abd9448f7f97767"} err="failed to get container status \"0b72a984e6ae9d267a0f0503a195eafcc0e1971c0a2345fe1abd9448f7f97767\": rpc error: code = NotFound desc = could not find container \"0b72a984e6ae9d267a0f0503a195eafcc0e1971c0a2345fe1abd9448f7f97767\": container with ID starting with 0b72a984e6ae9d267a0f0503a195eafcc0e1971c0a2345fe1abd9448f7f97767 not found: ID does not exist" Mar 10 00:10:20 crc kubenswrapper[4906]: I0310 00:10:20.337287 4906 scope.go:117] "RemoveContainer" containerID="086773ee90732ab31fb14a62f923340b51b27dd62d3352d268d528ae89554642" Mar 10 00:10:20 crc kubenswrapper[4906]: I0310 00:10:20.352887 4906 scope.go:117] "RemoveContainer" containerID="3bb25fbb8b0ed4921199667d54f1c840740b41d8811ac53682480adc15a4743b" Mar 10 00:10:20 crc kubenswrapper[4906]: I0310 00:10:20.370337 4906 scope.go:117] "RemoveContainer" containerID="6cb827060cebc472bb56218c6c152feb359da7d8ec87352ef260da8e18bf5232" Mar 10 00:10:20 crc kubenswrapper[4906]: I0310 00:10:20.391621 4906 scope.go:117] "RemoveContainer" containerID="086773ee90732ab31fb14a62f923340b51b27dd62d3352d268d528ae89554642" Mar 10 00:10:20 crc kubenswrapper[4906]: E0310 00:10:20.392214 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"086773ee90732ab31fb14a62f923340b51b27dd62d3352d268d528ae89554642\": container with ID starting with 086773ee90732ab31fb14a62f923340b51b27dd62d3352d268d528ae89554642 not found: ID does not exist" containerID="086773ee90732ab31fb14a62f923340b51b27dd62d3352d268d528ae89554642" Mar 10 00:10:20 crc kubenswrapper[4906]: I0310 00:10:20.392250 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"086773ee90732ab31fb14a62f923340b51b27dd62d3352d268d528ae89554642"} err="failed to get container status 
\"086773ee90732ab31fb14a62f923340b51b27dd62d3352d268d528ae89554642\": rpc error: code = NotFound desc = could not find container \"086773ee90732ab31fb14a62f923340b51b27dd62d3352d268d528ae89554642\": container with ID starting with 086773ee90732ab31fb14a62f923340b51b27dd62d3352d268d528ae89554642 not found: ID does not exist" Mar 10 00:10:20 crc kubenswrapper[4906]: I0310 00:10:20.392272 4906 scope.go:117] "RemoveContainer" containerID="3bb25fbb8b0ed4921199667d54f1c840740b41d8811ac53682480adc15a4743b" Mar 10 00:10:20 crc kubenswrapper[4906]: E0310 00:10:20.392686 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bb25fbb8b0ed4921199667d54f1c840740b41d8811ac53682480adc15a4743b\": container with ID starting with 3bb25fbb8b0ed4921199667d54f1c840740b41d8811ac53682480adc15a4743b not found: ID does not exist" containerID="3bb25fbb8b0ed4921199667d54f1c840740b41d8811ac53682480adc15a4743b" Mar 10 00:10:20 crc kubenswrapper[4906]: I0310 00:10:20.392764 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bb25fbb8b0ed4921199667d54f1c840740b41d8811ac53682480adc15a4743b"} err="failed to get container status \"3bb25fbb8b0ed4921199667d54f1c840740b41d8811ac53682480adc15a4743b\": rpc error: code = NotFound desc = could not find container \"3bb25fbb8b0ed4921199667d54f1c840740b41d8811ac53682480adc15a4743b\": container with ID starting with 3bb25fbb8b0ed4921199667d54f1c840740b41d8811ac53682480adc15a4743b not found: ID does not exist" Mar 10 00:10:20 crc kubenswrapper[4906]: I0310 00:10:20.392821 4906 scope.go:117] "RemoveContainer" containerID="6cb827060cebc472bb56218c6c152feb359da7d8ec87352ef260da8e18bf5232" Mar 10 00:10:20 crc kubenswrapper[4906]: E0310 00:10:20.393251 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"6cb827060cebc472bb56218c6c152feb359da7d8ec87352ef260da8e18bf5232\": container with ID starting with 6cb827060cebc472bb56218c6c152feb359da7d8ec87352ef260da8e18bf5232 not found: ID does not exist" containerID="6cb827060cebc472bb56218c6c152feb359da7d8ec87352ef260da8e18bf5232" Mar 10 00:10:20 crc kubenswrapper[4906]: I0310 00:10:20.393276 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cb827060cebc472bb56218c6c152feb359da7d8ec87352ef260da8e18bf5232"} err="failed to get container status \"6cb827060cebc472bb56218c6c152feb359da7d8ec87352ef260da8e18bf5232\": rpc error: code = NotFound desc = could not find container \"6cb827060cebc472bb56218c6c152feb359da7d8ec87352ef260da8e18bf5232\": container with ID starting with 6cb827060cebc472bb56218c6c152feb359da7d8ec87352ef260da8e18bf5232 not found: ID does not exist" Mar 10 00:10:20 crc kubenswrapper[4906]: I0310 00:10:20.597045 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52694cc4-226b-4bd5-a6c7-0ebd711926e2" path="/var/lib/kubelet/pods/52694cc4-226b-4bd5-a6c7-0ebd711926e2/volumes" Mar 10 00:10:20 crc kubenswrapper[4906]: I0310 00:10:20.598029 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfd0c098-c58b-456c-a9b2-270e749bc274" path="/var/lib/kubelet/pods/bfd0c098-c58b-456c-a9b2-270e749bc274/volumes" Mar 10 00:10:21 crc kubenswrapper[4906]: I0310 00:10:21.231950 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-js2xl"] Mar 10 00:10:21 crc kubenswrapper[4906]: I0310 00:10:21.285950 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-js2xl" podUID="9a68ed18-11e8-4943-854e-a8e4a5566313" containerName="registry-server" containerID="cri-o://9b7af24bc713c60d61d293891a7c9cab0bd5629ac587ed75009c8e1827efa8b3" gracePeriod=2 Mar 10 00:10:21 crc kubenswrapper[4906]: I0310 00:10:21.754305 4906 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-js2xl" Mar 10 00:10:21 crc kubenswrapper[4906]: I0310 00:10:21.773344 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a68ed18-11e8-4943-854e-a8e4a5566313-utilities\") pod \"9a68ed18-11e8-4943-854e-a8e4a5566313\" (UID: \"9a68ed18-11e8-4943-854e-a8e4a5566313\") " Mar 10 00:10:21 crc kubenswrapper[4906]: I0310 00:10:21.773438 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w58s7\" (UniqueName: \"kubernetes.io/projected/9a68ed18-11e8-4943-854e-a8e4a5566313-kube-api-access-w58s7\") pod \"9a68ed18-11e8-4943-854e-a8e4a5566313\" (UID: \"9a68ed18-11e8-4943-854e-a8e4a5566313\") " Mar 10 00:10:21 crc kubenswrapper[4906]: I0310 00:10:21.775018 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a68ed18-11e8-4943-854e-a8e4a5566313-utilities" (OuterVolumeSpecName: "utilities") pod "9a68ed18-11e8-4943-854e-a8e4a5566313" (UID: "9a68ed18-11e8-4943-854e-a8e4a5566313"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:10:21 crc kubenswrapper[4906]: I0310 00:10:21.780809 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a68ed18-11e8-4943-854e-a8e4a5566313-kube-api-access-w58s7" (OuterVolumeSpecName: "kube-api-access-w58s7") pod "9a68ed18-11e8-4943-854e-a8e4a5566313" (UID: "9a68ed18-11e8-4943-854e-a8e4a5566313"). InnerVolumeSpecName "kube-api-access-w58s7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:10:21 crc kubenswrapper[4906]: I0310 00:10:21.874246 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a68ed18-11e8-4943-854e-a8e4a5566313-catalog-content\") pod \"9a68ed18-11e8-4943-854e-a8e4a5566313\" (UID: \"9a68ed18-11e8-4943-854e-a8e4a5566313\") " Mar 10 00:10:21 crc kubenswrapper[4906]: I0310 00:10:21.875085 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w58s7\" (UniqueName: \"kubernetes.io/projected/9a68ed18-11e8-4943-854e-a8e4a5566313-kube-api-access-w58s7\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:21 crc kubenswrapper[4906]: I0310 00:10:21.875113 4906 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a68ed18-11e8-4943-854e-a8e4a5566313-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:21 crc kubenswrapper[4906]: I0310 00:10:21.907104 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a68ed18-11e8-4943-854e-a8e4a5566313-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9a68ed18-11e8-4943-854e-a8e4a5566313" (UID: "9a68ed18-11e8-4943-854e-a8e4a5566313"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:10:21 crc kubenswrapper[4906]: I0310 00:10:21.976292 4906 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a68ed18-11e8-4943-854e-a8e4a5566313-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:22 crc kubenswrapper[4906]: I0310 00:10:22.293971 4906 generic.go:334] "Generic (PLEG): container finished" podID="9a68ed18-11e8-4943-854e-a8e4a5566313" containerID="9b7af24bc713c60d61d293891a7c9cab0bd5629ac587ed75009c8e1827efa8b3" exitCode=0 Mar 10 00:10:22 crc kubenswrapper[4906]: I0310 00:10:22.294016 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-js2xl" event={"ID":"9a68ed18-11e8-4943-854e-a8e4a5566313","Type":"ContainerDied","Data":"9b7af24bc713c60d61d293891a7c9cab0bd5629ac587ed75009c8e1827efa8b3"} Mar 10 00:10:22 crc kubenswrapper[4906]: I0310 00:10:22.294043 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-js2xl" event={"ID":"9a68ed18-11e8-4943-854e-a8e4a5566313","Type":"ContainerDied","Data":"85b6d97638460e38f7f947eb9e3112739f9710544346fb5ae9fe79d346b0c8f3"} Mar 10 00:10:22 crc kubenswrapper[4906]: I0310 00:10:22.294064 4906 scope.go:117] "RemoveContainer" containerID="9b7af24bc713c60d61d293891a7c9cab0bd5629ac587ed75009c8e1827efa8b3" Mar 10 00:10:22 crc kubenswrapper[4906]: I0310 00:10:22.294111 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-js2xl" Mar 10 00:10:22 crc kubenswrapper[4906]: I0310 00:10:22.308695 4906 scope.go:117] "RemoveContainer" containerID="3d04e14238006d327f4c8e7fa63ffef1ab20cd17825a700fd7454d53a5e55d8f" Mar 10 00:10:22 crc kubenswrapper[4906]: I0310 00:10:22.329091 4906 scope.go:117] "RemoveContainer" containerID="da54625627bec5526c7f69c77098a77afc9ad6db23f31a857a973002e5e69107" Mar 10 00:10:22 crc kubenswrapper[4906]: I0310 00:10:22.330883 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-js2xl"] Mar 10 00:10:22 crc kubenswrapper[4906]: I0310 00:10:22.334018 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-js2xl"] Mar 10 00:10:22 crc kubenswrapper[4906]: I0310 00:10:22.352498 4906 scope.go:117] "RemoveContainer" containerID="9b7af24bc713c60d61d293891a7c9cab0bd5629ac587ed75009c8e1827efa8b3" Mar 10 00:10:22 crc kubenswrapper[4906]: E0310 00:10:22.352963 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b7af24bc713c60d61d293891a7c9cab0bd5629ac587ed75009c8e1827efa8b3\": container with ID starting with 9b7af24bc713c60d61d293891a7c9cab0bd5629ac587ed75009c8e1827efa8b3 not found: ID does not exist" containerID="9b7af24bc713c60d61d293891a7c9cab0bd5629ac587ed75009c8e1827efa8b3" Mar 10 00:10:22 crc kubenswrapper[4906]: I0310 00:10:22.352992 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b7af24bc713c60d61d293891a7c9cab0bd5629ac587ed75009c8e1827efa8b3"} err="failed to get container status \"9b7af24bc713c60d61d293891a7c9cab0bd5629ac587ed75009c8e1827efa8b3\": rpc error: code = NotFound desc = could not find container \"9b7af24bc713c60d61d293891a7c9cab0bd5629ac587ed75009c8e1827efa8b3\": container with ID starting with 9b7af24bc713c60d61d293891a7c9cab0bd5629ac587ed75009c8e1827efa8b3 not found: 
ID does not exist" Mar 10 00:10:22 crc kubenswrapper[4906]: I0310 00:10:22.353009 4906 scope.go:117] "RemoveContainer" containerID="3d04e14238006d327f4c8e7fa63ffef1ab20cd17825a700fd7454d53a5e55d8f" Mar 10 00:10:22 crc kubenswrapper[4906]: E0310 00:10:22.353305 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d04e14238006d327f4c8e7fa63ffef1ab20cd17825a700fd7454d53a5e55d8f\": container with ID starting with 3d04e14238006d327f4c8e7fa63ffef1ab20cd17825a700fd7454d53a5e55d8f not found: ID does not exist" containerID="3d04e14238006d327f4c8e7fa63ffef1ab20cd17825a700fd7454d53a5e55d8f" Mar 10 00:10:22 crc kubenswrapper[4906]: I0310 00:10:22.353321 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d04e14238006d327f4c8e7fa63ffef1ab20cd17825a700fd7454d53a5e55d8f"} err="failed to get container status \"3d04e14238006d327f4c8e7fa63ffef1ab20cd17825a700fd7454d53a5e55d8f\": rpc error: code = NotFound desc = could not find container \"3d04e14238006d327f4c8e7fa63ffef1ab20cd17825a700fd7454d53a5e55d8f\": container with ID starting with 3d04e14238006d327f4c8e7fa63ffef1ab20cd17825a700fd7454d53a5e55d8f not found: ID does not exist" Mar 10 00:10:22 crc kubenswrapper[4906]: I0310 00:10:22.353333 4906 scope.go:117] "RemoveContainer" containerID="da54625627bec5526c7f69c77098a77afc9ad6db23f31a857a973002e5e69107" Mar 10 00:10:22 crc kubenswrapper[4906]: E0310 00:10:22.353732 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da54625627bec5526c7f69c77098a77afc9ad6db23f31a857a973002e5e69107\": container with ID starting with da54625627bec5526c7f69c77098a77afc9ad6db23f31a857a973002e5e69107 not found: ID does not exist" containerID="da54625627bec5526c7f69c77098a77afc9ad6db23f31a857a973002e5e69107" Mar 10 00:10:22 crc kubenswrapper[4906]: I0310 00:10:22.353782 4906 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da54625627bec5526c7f69c77098a77afc9ad6db23f31a857a973002e5e69107"} err="failed to get container status \"da54625627bec5526c7f69c77098a77afc9ad6db23f31a857a973002e5e69107\": rpc error: code = NotFound desc = could not find container \"da54625627bec5526c7f69c77098a77afc9ad6db23f31a857a973002e5e69107\": container with ID starting with da54625627bec5526c7f69c77098a77afc9ad6db23f31a857a973002e5e69107 not found: ID does not exist" Mar 10 00:10:22 crc kubenswrapper[4906]: I0310 00:10:22.582630 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a68ed18-11e8-4943-854e-a8e4a5566313" path="/var/lib/kubelet/pods/9a68ed18-11e8-4943-854e-a8e4a5566313/volumes" Mar 10 00:10:24 crc kubenswrapper[4906]: I0310 00:10:24.118463 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-mq564" podUID="4b4509ca-5d20-4f5c-89ea-a910f792ff82" containerName="oauth-openshift" containerID="cri-o://1b21cc89a8de7a59f495e570312062f232cb09206d98ad3e7b913ac58c3590b6" gracePeriod=15 Mar 10 00:10:24 crc kubenswrapper[4906]: I0310 00:10:24.308423 4906 generic.go:334] "Generic (PLEG): container finished" podID="4b4509ca-5d20-4f5c-89ea-a910f792ff82" containerID="1b21cc89a8de7a59f495e570312062f232cb09206d98ad3e7b913ac58c3590b6" exitCode=0 Mar 10 00:10:24 crc kubenswrapper[4906]: I0310 00:10:24.308699 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-mq564" event={"ID":"4b4509ca-5d20-4f5c-89ea-a910f792ff82","Type":"ContainerDied","Data":"1b21cc89a8de7a59f495e570312062f232cb09206d98ad3e7b913ac58c3590b6"} Mar 10 00:10:24 crc kubenswrapper[4906]: I0310 00:10:24.571274 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-mq564" Mar 10 00:10:24 crc kubenswrapper[4906]: I0310 00:10:24.615229 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4b4509ca-5d20-4f5c-89ea-a910f792ff82-audit-policies\") pod \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\" (UID: \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\") " Mar 10 00:10:24 crc kubenswrapper[4906]: I0310 00:10:24.615273 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-system-router-certs\") pod \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\" (UID: \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\") " Mar 10 00:10:24 crc kubenswrapper[4906]: I0310 00:10:24.615307 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-system-cliconfig\") pod \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\" (UID: \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\") " Mar 10 00:10:24 crc kubenswrapper[4906]: I0310 00:10:24.616093 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "4b4509ca-5d20-4f5c-89ea-a910f792ff82" (UID: "4b4509ca-5d20-4f5c-89ea-a910f792ff82"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:10:24 crc kubenswrapper[4906]: I0310 00:10:24.615339 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-user-template-login\") pod \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\" (UID: \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\") " Mar 10 00:10:24 crc kubenswrapper[4906]: I0310 00:10:24.616175 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjdx4\" (UniqueName: \"kubernetes.io/projected/4b4509ca-5d20-4f5c-89ea-a910f792ff82-kube-api-access-vjdx4\") pod \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\" (UID: \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\") " Mar 10 00:10:24 crc kubenswrapper[4906]: I0310 00:10:24.616165 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b4509ca-5d20-4f5c-89ea-a910f792ff82-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "4b4509ca-5d20-4f5c-89ea-a910f792ff82" (UID: "4b4509ca-5d20-4f5c-89ea-a910f792ff82"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:10:24 crc kubenswrapper[4906]: I0310 00:10:24.616201 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-system-serving-cert\") pod \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\" (UID: \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\") " Mar 10 00:10:24 crc kubenswrapper[4906]: I0310 00:10:24.616240 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-system-ocp-branding-template\") pod \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\" (UID: \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\") " Mar 10 00:10:24 crc kubenswrapper[4906]: I0310 00:10:24.616258 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4b4509ca-5d20-4f5c-89ea-a910f792ff82-audit-dir\") pod \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\" (UID: \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\") " Mar 10 00:10:24 crc kubenswrapper[4906]: I0310 00:10:24.616406 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4b4509ca-5d20-4f5c-89ea-a910f792ff82-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "4b4509ca-5d20-4f5c-89ea-a910f792ff82" (UID: "4b4509ca-5d20-4f5c-89ea-a910f792ff82"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:10:24 crc kubenswrapper[4906]: I0310 00:10:24.616546 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-user-template-provider-selection\") pod \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\" (UID: \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\") " Mar 10 00:10:24 crc kubenswrapper[4906]: I0310 00:10:24.616612 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-system-session\") pod \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\" (UID: \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\") " Mar 10 00:10:24 crc kubenswrapper[4906]: I0310 00:10:24.616645 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-system-service-ca\") pod \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\" (UID: \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\") " Mar 10 00:10:24 crc kubenswrapper[4906]: I0310 00:10:24.616669 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-user-template-error\") pod \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\" (UID: \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\") " Mar 10 00:10:24 crc kubenswrapper[4906]: I0310 00:10:24.616690 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-system-trusted-ca-bundle\") pod \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\" (UID: 
\"4b4509ca-5d20-4f5c-89ea-a910f792ff82\") " Mar 10 00:10:24 crc kubenswrapper[4906]: I0310 00:10:24.616709 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-user-idp-0-file-data\") pod \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\" (UID: \"4b4509ca-5d20-4f5c-89ea-a910f792ff82\") " Mar 10 00:10:24 crc kubenswrapper[4906]: I0310 00:10:24.617037 4906 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4b4509ca-5d20-4f5c-89ea-a910f792ff82-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:24 crc kubenswrapper[4906]: I0310 00:10:24.617056 4906 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:24 crc kubenswrapper[4906]: I0310 00:10:24.617066 4906 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4b4509ca-5d20-4f5c-89ea-a910f792ff82-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:24 crc kubenswrapper[4906]: I0310 00:10:24.624105 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "4b4509ca-5d20-4f5c-89ea-a910f792ff82" (UID: "4b4509ca-5d20-4f5c-89ea-a910f792ff82"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:10:24 crc kubenswrapper[4906]: I0310 00:10:24.624364 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "4b4509ca-5d20-4f5c-89ea-a910f792ff82" (UID: "4b4509ca-5d20-4f5c-89ea-a910f792ff82"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:10:24 crc kubenswrapper[4906]: I0310 00:10:24.629788 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "4b4509ca-5d20-4f5c-89ea-a910f792ff82" (UID: "4b4509ca-5d20-4f5c-89ea-a910f792ff82"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:10:24 crc kubenswrapper[4906]: I0310 00:10:24.642955 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "4b4509ca-5d20-4f5c-89ea-a910f792ff82" (UID: "4b4509ca-5d20-4f5c-89ea-a910f792ff82"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:10:24 crc kubenswrapper[4906]: I0310 00:10:24.643024 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b4509ca-5d20-4f5c-89ea-a910f792ff82-kube-api-access-vjdx4" (OuterVolumeSpecName: "kube-api-access-vjdx4") pod "4b4509ca-5d20-4f5c-89ea-a910f792ff82" (UID: "4b4509ca-5d20-4f5c-89ea-a910f792ff82"). InnerVolumeSpecName "kube-api-access-vjdx4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:10:24 crc kubenswrapper[4906]: I0310 00:10:24.655204 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "4b4509ca-5d20-4f5c-89ea-a910f792ff82" (UID: "4b4509ca-5d20-4f5c-89ea-a910f792ff82"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:10:24 crc kubenswrapper[4906]: I0310 00:10:24.656574 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "4b4509ca-5d20-4f5c-89ea-a910f792ff82" (UID: "4b4509ca-5d20-4f5c-89ea-a910f792ff82"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:10:24 crc kubenswrapper[4906]: I0310 00:10:24.657119 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "4b4509ca-5d20-4f5c-89ea-a910f792ff82" (UID: "4b4509ca-5d20-4f5c-89ea-a910f792ff82"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:10:24 crc kubenswrapper[4906]: I0310 00:10:24.659088 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "4b4509ca-5d20-4f5c-89ea-a910f792ff82" (UID: "4b4509ca-5d20-4f5c-89ea-a910f792ff82"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:10:24 crc kubenswrapper[4906]: I0310 00:10:24.659286 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "4b4509ca-5d20-4f5c-89ea-a910f792ff82" (UID: "4b4509ca-5d20-4f5c-89ea-a910f792ff82"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:10:24 crc kubenswrapper[4906]: I0310 00:10:24.660888 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "4b4509ca-5d20-4f5c-89ea-a910f792ff82" (UID: "4b4509ca-5d20-4f5c-89ea-a910f792ff82"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:10:24 crc kubenswrapper[4906]: I0310 00:10:24.718383 4906 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:24 crc kubenswrapper[4906]: I0310 00:10:24.718442 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjdx4\" (UniqueName: \"kubernetes.io/projected/4b4509ca-5d20-4f5c-89ea-a910f792ff82-kube-api-access-vjdx4\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:24 crc kubenswrapper[4906]: I0310 00:10:24.718455 4906 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:24 crc kubenswrapper[4906]: I0310 00:10:24.718474 4906 reconciler_common.go:293] "Volume 
detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:24 crc kubenswrapper[4906]: I0310 00:10:24.718485 4906 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:24 crc kubenswrapper[4906]: I0310 00:10:24.718497 4906 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:24 crc kubenswrapper[4906]: I0310 00:10:24.718506 4906 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:24 crc kubenswrapper[4906]: I0310 00:10:24.718519 4906 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:24 crc kubenswrapper[4906]: I0310 00:10:24.718527 4906 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:24 crc kubenswrapper[4906]: I0310 00:10:24.718537 4906 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:24 crc kubenswrapper[4906]: I0310 00:10:24.718547 4906 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4b4509ca-5d20-4f5c-89ea-a910f792ff82-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:25 crc kubenswrapper[4906]: I0310 00:10:25.315614 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-mq564" event={"ID":"4b4509ca-5d20-4f5c-89ea-a910f792ff82","Type":"ContainerDied","Data":"307cad07582259a45f6385c96a34fdcb026d172826af0c76f2444a424de835b0"} Mar 10 00:10:25 crc kubenswrapper[4906]: I0310 00:10:25.315697 4906 scope.go:117] "RemoveContainer" containerID="1b21cc89a8de7a59f495e570312062f232cb09206d98ad3e7b913ac58c3590b6" Mar 10 00:10:25 crc kubenswrapper[4906]: I0310 00:10:25.316506 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-mq564" Mar 10 00:10:25 crc kubenswrapper[4906]: I0310 00:10:25.346498 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mq564"] Mar 10 00:10:25 crc kubenswrapper[4906]: I0310 00:10:25.350397 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mq564"] Mar 10 00:10:26 crc kubenswrapper[4906]: I0310 00:10:26.583818 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b4509ca-5d20-4f5c-89ea-a910f792ff82" path="/var/lib/kubelet/pods/4b4509ca-5d20-4f5c-89ea-a910f792ff82/volumes" Mar 10 00:10:30 crc kubenswrapper[4906]: I0310 00:10:30.502887 4906 patch_prober.go:28] interesting pod/machine-config-daemon-bxtw4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:10:30 crc kubenswrapper[4906]: I0310 00:10:30.503223 4906 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.234592 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-68974c876c-p4n54"] Mar 10 00:10:32 crc kubenswrapper[4906]: E0310 00:10:32.235231 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a68ed18-11e8-4943-854e-a8e4a5566313" containerName="registry-server" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.235249 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a68ed18-11e8-4943-854e-a8e4a5566313" 
containerName="registry-server" Mar 10 00:10:32 crc kubenswrapper[4906]: E0310 00:10:32.235265 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfd0c098-c58b-456c-a9b2-270e749bc274" containerName="extract-utilities" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.235274 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfd0c098-c58b-456c-a9b2-270e749bc274" containerName="extract-utilities" Mar 10 00:10:32 crc kubenswrapper[4906]: E0310 00:10:32.235285 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52694cc4-226b-4bd5-a6c7-0ebd711926e2" containerName="extract-content" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.235294 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="52694cc4-226b-4bd5-a6c7-0ebd711926e2" containerName="extract-content" Mar 10 00:10:32 crc kubenswrapper[4906]: E0310 00:10:32.235306 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52694cc4-226b-4bd5-a6c7-0ebd711926e2" containerName="registry-server" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.235314 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="52694cc4-226b-4bd5-a6c7-0ebd711926e2" containerName="registry-server" Mar 10 00:10:32 crc kubenswrapper[4906]: E0310 00:10:32.235322 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a68ed18-11e8-4943-854e-a8e4a5566313" containerName="extract-content" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.235329 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a68ed18-11e8-4943-854e-a8e4a5566313" containerName="extract-content" Mar 10 00:10:32 crc kubenswrapper[4906]: E0310 00:10:32.235340 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52694cc4-226b-4bd5-a6c7-0ebd711926e2" containerName="extract-utilities" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.235349 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="52694cc4-226b-4bd5-a6c7-0ebd711926e2" 
containerName="extract-utilities" Mar 10 00:10:32 crc kubenswrapper[4906]: E0310 00:10:32.235359 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a68ed18-11e8-4943-854e-a8e4a5566313" containerName="extract-utilities" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.235367 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a68ed18-11e8-4943-854e-a8e4a5566313" containerName="extract-utilities" Mar 10 00:10:32 crc kubenswrapper[4906]: E0310 00:10:32.235376 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b4509ca-5d20-4f5c-89ea-a910f792ff82" containerName="oauth-openshift" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.235384 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b4509ca-5d20-4f5c-89ea-a910f792ff82" containerName="oauth-openshift" Mar 10 00:10:32 crc kubenswrapper[4906]: E0310 00:10:32.235393 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfd0c098-c58b-456c-a9b2-270e749bc274" containerName="extract-content" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.235401 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfd0c098-c58b-456c-a9b2-270e749bc274" containerName="extract-content" Mar 10 00:10:32 crc kubenswrapper[4906]: E0310 00:10:32.235412 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfd0c098-c58b-456c-a9b2-270e749bc274" containerName="registry-server" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.235419 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfd0c098-c58b-456c-a9b2-270e749bc274" containerName="registry-server" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.235535 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="52694cc4-226b-4bd5-a6c7-0ebd711926e2" containerName="registry-server" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.235549 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfd0c098-c58b-456c-a9b2-270e749bc274" 
containerName="registry-server" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.235562 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a68ed18-11e8-4943-854e-a8e4a5566313" containerName="registry-server" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.235577 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b4509ca-5d20-4f5c-89ea-a910f792ff82" containerName="oauth-openshift" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.236080 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-68974c876c-p4n54" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.238284 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.239740 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.240011 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.240196 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.240597 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.240793 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.240859 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 
00:10:32.240959 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.241077 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.241322 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.242010 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.250205 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.292668 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-68974c876c-p4n54"] Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.302972 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.310655 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d554f4b5-37ad-48e1-9891-18941779a5f0-v4-0-config-user-template-error\") pod \"oauth-openshift-68974c876c-p4n54\" (UID: \"d554f4b5-37ad-48e1-9891-18941779a5f0\") " pod="openshift-authentication/oauth-openshift-68974c876c-p4n54" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.310781 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/d554f4b5-37ad-48e1-9891-18941779a5f0-v4-0-config-system-serving-cert\") pod \"oauth-openshift-68974c876c-p4n54\" (UID: \"d554f4b5-37ad-48e1-9891-18941779a5f0\") " pod="openshift-authentication/oauth-openshift-68974c876c-p4n54" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.310836 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwclh\" (UniqueName: \"kubernetes.io/projected/d554f4b5-37ad-48e1-9891-18941779a5f0-kube-api-access-vwclh\") pod \"oauth-openshift-68974c876c-p4n54\" (UID: \"d554f4b5-37ad-48e1-9891-18941779a5f0\") " pod="openshift-authentication/oauth-openshift-68974c876c-p4n54" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.310867 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.310894 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d554f4b5-37ad-48e1-9891-18941779a5f0-v4-0-config-system-cliconfig\") pod \"oauth-openshift-68974c876c-p4n54\" (UID: \"d554f4b5-37ad-48e1-9891-18941779a5f0\") " pod="openshift-authentication/oauth-openshift-68974c876c-p4n54" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.311122 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d554f4b5-37ad-48e1-9891-18941779a5f0-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-68974c876c-p4n54\" (UID: \"d554f4b5-37ad-48e1-9891-18941779a5f0\") " pod="openshift-authentication/oauth-openshift-68974c876c-p4n54" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.311160 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d554f4b5-37ad-48e1-9891-18941779a5f0-v4-0-config-system-session\") pod \"oauth-openshift-68974c876c-p4n54\" (UID: \"d554f4b5-37ad-48e1-9891-18941779a5f0\") " pod="openshift-authentication/oauth-openshift-68974c876c-p4n54" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.311186 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d554f4b5-37ad-48e1-9891-18941779a5f0-v4-0-config-system-service-ca\") pod \"oauth-openshift-68974c876c-p4n54\" (UID: \"d554f4b5-37ad-48e1-9891-18941779a5f0\") " pod="openshift-authentication/oauth-openshift-68974c876c-p4n54" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.311213 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d554f4b5-37ad-48e1-9891-18941779a5f0-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-68974c876c-p4n54\" (UID: \"d554f4b5-37ad-48e1-9891-18941779a5f0\") " pod="openshift-authentication/oauth-openshift-68974c876c-p4n54" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.311255 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d554f4b5-37ad-48e1-9891-18941779a5f0-v4-0-config-user-template-login\") pod \"oauth-openshift-68974c876c-p4n54\" (UID: \"d554f4b5-37ad-48e1-9891-18941779a5f0\") " pod="openshift-authentication/oauth-openshift-68974c876c-p4n54" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.311355 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d554f4b5-37ad-48e1-9891-18941779a5f0-audit-dir\") pod \"oauth-openshift-68974c876c-p4n54\" (UID: 
\"d554f4b5-37ad-48e1-9891-18941779a5f0\") " pod="openshift-authentication/oauth-openshift-68974c876c-p4n54" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.311416 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d554f4b5-37ad-48e1-9891-18941779a5f0-audit-policies\") pod \"oauth-openshift-68974c876c-p4n54\" (UID: \"d554f4b5-37ad-48e1-9891-18941779a5f0\") " pod="openshift-authentication/oauth-openshift-68974c876c-p4n54" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.311537 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d554f4b5-37ad-48e1-9891-18941779a5f0-v4-0-config-system-router-certs\") pod \"oauth-openshift-68974c876c-p4n54\" (UID: \"d554f4b5-37ad-48e1-9891-18941779a5f0\") " pod="openshift-authentication/oauth-openshift-68974c876c-p4n54" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.311570 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d554f4b5-37ad-48e1-9891-18941779a5f0-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-68974c876c-p4n54\" (UID: \"d554f4b5-37ad-48e1-9891-18941779a5f0\") " pod="openshift-authentication/oauth-openshift-68974c876c-p4n54" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.311616 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d554f4b5-37ad-48e1-9891-18941779a5f0-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-68974c876c-p4n54\" (UID: \"d554f4b5-37ad-48e1-9891-18941779a5f0\") " pod="openshift-authentication/oauth-openshift-68974c876c-p4n54" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 
00:10:32.316729 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.413236 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d554f4b5-37ad-48e1-9891-18941779a5f0-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-68974c876c-p4n54\" (UID: \"d554f4b5-37ad-48e1-9891-18941779a5f0\") " pod="openshift-authentication/oauth-openshift-68974c876c-p4n54" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.413298 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d554f4b5-37ad-48e1-9891-18941779a5f0-v4-0-config-system-session\") pod \"oauth-openshift-68974c876c-p4n54\" (UID: \"d554f4b5-37ad-48e1-9891-18941779a5f0\") " pod="openshift-authentication/oauth-openshift-68974c876c-p4n54" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.413321 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d554f4b5-37ad-48e1-9891-18941779a5f0-v4-0-config-system-service-ca\") pod \"oauth-openshift-68974c876c-p4n54\" (UID: \"d554f4b5-37ad-48e1-9891-18941779a5f0\") " pod="openshift-authentication/oauth-openshift-68974c876c-p4n54" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.413350 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d554f4b5-37ad-48e1-9891-18941779a5f0-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-68974c876c-p4n54\" (UID: \"d554f4b5-37ad-48e1-9891-18941779a5f0\") " pod="openshift-authentication/oauth-openshift-68974c876c-p4n54" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 
00:10:32.413377 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d554f4b5-37ad-48e1-9891-18941779a5f0-v4-0-config-user-template-login\") pod \"oauth-openshift-68974c876c-p4n54\" (UID: \"d554f4b5-37ad-48e1-9891-18941779a5f0\") " pod="openshift-authentication/oauth-openshift-68974c876c-p4n54" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.413402 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d554f4b5-37ad-48e1-9891-18941779a5f0-audit-dir\") pod \"oauth-openshift-68974c876c-p4n54\" (UID: \"d554f4b5-37ad-48e1-9891-18941779a5f0\") " pod="openshift-authentication/oauth-openshift-68974c876c-p4n54" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.413426 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d554f4b5-37ad-48e1-9891-18941779a5f0-audit-policies\") pod \"oauth-openshift-68974c876c-p4n54\" (UID: \"d554f4b5-37ad-48e1-9891-18941779a5f0\") " pod="openshift-authentication/oauth-openshift-68974c876c-p4n54" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.413468 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d554f4b5-37ad-48e1-9891-18941779a5f0-v4-0-config-system-router-certs\") pod \"oauth-openshift-68974c876c-p4n54\" (UID: \"d554f4b5-37ad-48e1-9891-18941779a5f0\") " pod="openshift-authentication/oauth-openshift-68974c876c-p4n54" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.413495 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d554f4b5-37ad-48e1-9891-18941779a5f0-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-68974c876c-p4n54\" (UID: 
\"d554f4b5-37ad-48e1-9891-18941779a5f0\") " pod="openshift-authentication/oauth-openshift-68974c876c-p4n54" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.413547 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d554f4b5-37ad-48e1-9891-18941779a5f0-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-68974c876c-p4n54\" (UID: \"d554f4b5-37ad-48e1-9891-18941779a5f0\") " pod="openshift-authentication/oauth-openshift-68974c876c-p4n54" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.413581 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d554f4b5-37ad-48e1-9891-18941779a5f0-v4-0-config-user-template-error\") pod \"oauth-openshift-68974c876c-p4n54\" (UID: \"d554f4b5-37ad-48e1-9891-18941779a5f0\") " pod="openshift-authentication/oauth-openshift-68974c876c-p4n54" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.413627 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d554f4b5-37ad-48e1-9891-18941779a5f0-v4-0-config-system-serving-cert\") pod \"oauth-openshift-68974c876c-p4n54\" (UID: \"d554f4b5-37ad-48e1-9891-18941779a5f0\") " pod="openshift-authentication/oauth-openshift-68974c876c-p4n54" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.413662 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwclh\" (UniqueName: \"kubernetes.io/projected/d554f4b5-37ad-48e1-9891-18941779a5f0-kube-api-access-vwclh\") pod \"oauth-openshift-68974c876c-p4n54\" (UID: \"d554f4b5-37ad-48e1-9891-18941779a5f0\") " pod="openshift-authentication/oauth-openshift-68974c876c-p4n54" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.413687 4906 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d554f4b5-37ad-48e1-9891-18941779a5f0-v4-0-config-system-cliconfig\") pod \"oauth-openshift-68974c876c-p4n54\" (UID: \"d554f4b5-37ad-48e1-9891-18941779a5f0\") " pod="openshift-authentication/oauth-openshift-68974c876c-p4n54" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.413683 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d554f4b5-37ad-48e1-9891-18941779a5f0-audit-dir\") pod \"oauth-openshift-68974c876c-p4n54\" (UID: \"d554f4b5-37ad-48e1-9891-18941779a5f0\") " pod="openshift-authentication/oauth-openshift-68974c876c-p4n54" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.415539 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d554f4b5-37ad-48e1-9891-18941779a5f0-audit-policies\") pod \"oauth-openshift-68974c876c-p4n54\" (UID: \"d554f4b5-37ad-48e1-9891-18941779a5f0\") " pod="openshift-authentication/oauth-openshift-68974c876c-p4n54" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.415716 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d554f4b5-37ad-48e1-9891-18941779a5f0-v4-0-config-system-cliconfig\") pod \"oauth-openshift-68974c876c-p4n54\" (UID: \"d554f4b5-37ad-48e1-9891-18941779a5f0\") " pod="openshift-authentication/oauth-openshift-68974c876c-p4n54" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.416097 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d554f4b5-37ad-48e1-9891-18941779a5f0-v4-0-config-system-service-ca\") pod \"oauth-openshift-68974c876c-p4n54\" (UID: \"d554f4b5-37ad-48e1-9891-18941779a5f0\") " 
pod="openshift-authentication/oauth-openshift-68974c876c-p4n54" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.416299 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d554f4b5-37ad-48e1-9891-18941779a5f0-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-68974c876c-p4n54\" (UID: \"d554f4b5-37ad-48e1-9891-18941779a5f0\") " pod="openshift-authentication/oauth-openshift-68974c876c-p4n54" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.418534 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d554f4b5-37ad-48e1-9891-18941779a5f0-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-68974c876c-p4n54\" (UID: \"d554f4b5-37ad-48e1-9891-18941779a5f0\") " pod="openshift-authentication/oauth-openshift-68974c876c-p4n54" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.418708 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d554f4b5-37ad-48e1-9891-18941779a5f0-v4-0-config-system-serving-cert\") pod \"oauth-openshift-68974c876c-p4n54\" (UID: \"d554f4b5-37ad-48e1-9891-18941779a5f0\") " pod="openshift-authentication/oauth-openshift-68974c876c-p4n54" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.418821 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d554f4b5-37ad-48e1-9891-18941779a5f0-v4-0-config-system-session\") pod \"oauth-openshift-68974c876c-p4n54\" (UID: \"d554f4b5-37ad-48e1-9891-18941779a5f0\") " pod="openshift-authentication/oauth-openshift-68974c876c-p4n54" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.419206 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d554f4b5-37ad-48e1-9891-18941779a5f0-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-68974c876c-p4n54\" (UID: \"d554f4b5-37ad-48e1-9891-18941779a5f0\") " pod="openshift-authentication/oauth-openshift-68974c876c-p4n54" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.419353 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d554f4b5-37ad-48e1-9891-18941779a5f0-v4-0-config-user-template-error\") pod \"oauth-openshift-68974c876c-p4n54\" (UID: \"d554f4b5-37ad-48e1-9891-18941779a5f0\") " pod="openshift-authentication/oauth-openshift-68974c876c-p4n54" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.420164 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d554f4b5-37ad-48e1-9891-18941779a5f0-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-68974c876c-p4n54\" (UID: \"d554f4b5-37ad-48e1-9891-18941779a5f0\") " pod="openshift-authentication/oauth-openshift-68974c876c-p4n54" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.421896 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d554f4b5-37ad-48e1-9891-18941779a5f0-v4-0-config-system-router-certs\") pod \"oauth-openshift-68974c876c-p4n54\" (UID: \"d554f4b5-37ad-48e1-9891-18941779a5f0\") " pod="openshift-authentication/oauth-openshift-68974c876c-p4n54" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.428533 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d554f4b5-37ad-48e1-9891-18941779a5f0-v4-0-config-user-template-login\") pod \"oauth-openshift-68974c876c-p4n54\" (UID: \"d554f4b5-37ad-48e1-9891-18941779a5f0\") " 
pod="openshift-authentication/oauth-openshift-68974c876c-p4n54" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.429475 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwclh\" (UniqueName: \"kubernetes.io/projected/d554f4b5-37ad-48e1-9891-18941779a5f0-kube-api-access-vwclh\") pod \"oauth-openshift-68974c876c-p4n54\" (UID: \"d554f4b5-37ad-48e1-9891-18941779a5f0\") " pod="openshift-authentication/oauth-openshift-68974c876c-p4n54" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.609601 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-68974c876c-p4n54" Mar 10 00:10:32 crc kubenswrapper[4906]: I0310 00:10:32.747740 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 00:10:33 crc kubenswrapper[4906]: I0310 00:10:33.000537 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-68974c876c-p4n54"] Mar 10 00:10:33 crc kubenswrapper[4906]: W0310 00:10:33.006455 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd554f4b5_37ad_48e1_9891_18941779a5f0.slice/crio-b52380b101d813fc55ca20f133cf30192e9abb861378d3b7e35eeb1027127ef3 WatchSource:0}: Error finding container b52380b101d813fc55ca20f133cf30192e9abb861378d3b7e35eeb1027127ef3: Status 404 returned error can't find the container with id b52380b101d813fc55ca20f133cf30192e9abb861378d3b7e35eeb1027127ef3 Mar 10 00:10:33 crc kubenswrapper[4906]: I0310 00:10:33.355875 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-68974c876c-p4n54" event={"ID":"d554f4b5-37ad-48e1-9891-18941779a5f0","Type":"ContainerStarted","Data":"54bace6b565d727e480b3b7d950d5bd2488bf15a9db94b5b594a853a5eb1cdf0"} Mar 10 00:10:33 crc kubenswrapper[4906]: I0310 
00:10:33.355936 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-68974c876c-p4n54" event={"ID":"d554f4b5-37ad-48e1-9891-18941779a5f0","Type":"ContainerStarted","Data":"b52380b101d813fc55ca20f133cf30192e9abb861378d3b7e35eeb1027127ef3"} Mar 10 00:10:33 crc kubenswrapper[4906]: I0310 00:10:33.357537 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-68974c876c-p4n54" Mar 10 00:10:33 crc kubenswrapper[4906]: I0310 00:10:33.381957 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-68974c876c-p4n54" podStartSLOduration=34.381937974 podStartE2EDuration="34.381937974s" podCreationTimestamp="2026-03-10 00:09:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:10:33.379228618 +0000 UTC m=+259.527123750" watchObservedRunningTime="2026-03-10 00:10:33.381937974 +0000 UTC m=+259.529833096" Mar 10 00:10:33 crc kubenswrapper[4906]: I0310 00:10:33.549104 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-68974c876c-p4n54" Mar 10 00:10:35 crc kubenswrapper[4906]: I0310 00:10:35.104915 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-76f87444f8-c5ww8"] Mar 10 00:10:35 crc kubenswrapper[4906]: I0310 00:10:35.105233 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-76f87444f8-c5ww8" podUID="804ce14a-8267-4a2e-a90e-0572096372ef" containerName="controller-manager" containerID="cri-o://fda7be75d8a2f8e53ba2b2d838e62068b033c991ed85bb33108deb3630c4a3c7" gracePeriod=30 Mar 10 00:10:35 crc kubenswrapper[4906]: I0310 00:10:35.190297 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-8bc597f48-dwzlt"] Mar 10 00:10:35 crc kubenswrapper[4906]: I0310 00:10:35.190617 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-8bc597f48-dwzlt" podUID="b5eb5696-e150-477c-b145-2ce5f4161410" containerName="route-controller-manager" containerID="cri-o://b5720c002243b3ca69d5c4d46d0d91442e2341bc795ff578132042a6ab360b9e" gracePeriod=30 Mar 10 00:10:35 crc kubenswrapper[4906]: I0310 00:10:35.375309 4906 generic.go:334] "Generic (PLEG): container finished" podID="b5eb5696-e150-477c-b145-2ce5f4161410" containerID="b5720c002243b3ca69d5c4d46d0d91442e2341bc795ff578132042a6ab360b9e" exitCode=0 Mar 10 00:10:35 crc kubenswrapper[4906]: I0310 00:10:35.376348 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8bc597f48-dwzlt" event={"ID":"b5eb5696-e150-477c-b145-2ce5f4161410","Type":"ContainerDied","Data":"b5720c002243b3ca69d5c4d46d0d91442e2341bc795ff578132042a6ab360b9e"} Mar 10 00:10:35 crc kubenswrapper[4906]: I0310 00:10:35.378661 4906 generic.go:334] "Generic (PLEG): container finished" podID="804ce14a-8267-4a2e-a90e-0572096372ef" containerID="fda7be75d8a2f8e53ba2b2d838e62068b033c991ed85bb33108deb3630c4a3c7" exitCode=0 Mar 10 00:10:35 crc kubenswrapper[4906]: I0310 00:10:35.378743 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76f87444f8-c5ww8" event={"ID":"804ce14a-8267-4a2e-a90e-0572096372ef","Type":"ContainerDied","Data":"fda7be75d8a2f8e53ba2b2d838e62068b033c991ed85bb33108deb3630c4a3c7"} Mar 10 00:10:35 crc kubenswrapper[4906]: I0310 00:10:35.729009 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8bc597f48-dwzlt" Mar 10 00:10:35 crc kubenswrapper[4906]: I0310 00:10:35.733165 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-76f87444f8-c5ww8" Mar 10 00:10:35 crc kubenswrapper[4906]: I0310 00:10:35.873230 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5eb5696-e150-477c-b145-2ce5f4161410-client-ca\") pod \"b5eb5696-e150-477c-b145-2ce5f4161410\" (UID: \"b5eb5696-e150-477c-b145-2ce5f4161410\") " Mar 10 00:10:35 crc kubenswrapper[4906]: I0310 00:10:35.873325 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5eb5696-e150-477c-b145-2ce5f4161410-config\") pod \"b5eb5696-e150-477c-b145-2ce5f4161410\" (UID: \"b5eb5696-e150-477c-b145-2ce5f4161410\") " Mar 10 00:10:35 crc kubenswrapper[4906]: I0310 00:10:35.873371 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mw76\" (UniqueName: \"kubernetes.io/projected/b5eb5696-e150-477c-b145-2ce5f4161410-kube-api-access-2mw76\") pod \"b5eb5696-e150-477c-b145-2ce5f4161410\" (UID: \"b5eb5696-e150-477c-b145-2ce5f4161410\") " Mar 10 00:10:35 crc kubenswrapper[4906]: I0310 00:10:35.873411 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/804ce14a-8267-4a2e-a90e-0572096372ef-client-ca\") pod \"804ce14a-8267-4a2e-a90e-0572096372ef\" (UID: \"804ce14a-8267-4a2e-a90e-0572096372ef\") " Mar 10 00:10:35 crc kubenswrapper[4906]: I0310 00:10:35.873460 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/804ce14a-8267-4a2e-a90e-0572096372ef-config\") pod 
\"804ce14a-8267-4a2e-a90e-0572096372ef\" (UID: \"804ce14a-8267-4a2e-a90e-0572096372ef\") " Mar 10 00:10:35 crc kubenswrapper[4906]: I0310 00:10:35.873486 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5eb5696-e150-477c-b145-2ce5f4161410-serving-cert\") pod \"b5eb5696-e150-477c-b145-2ce5f4161410\" (UID: \"b5eb5696-e150-477c-b145-2ce5f4161410\") " Mar 10 00:10:35 crc kubenswrapper[4906]: I0310 00:10:35.873506 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdb6v\" (UniqueName: \"kubernetes.io/projected/804ce14a-8267-4a2e-a90e-0572096372ef-kube-api-access-fdb6v\") pod \"804ce14a-8267-4a2e-a90e-0572096372ef\" (UID: \"804ce14a-8267-4a2e-a90e-0572096372ef\") " Mar 10 00:10:35 crc kubenswrapper[4906]: I0310 00:10:35.873534 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/804ce14a-8267-4a2e-a90e-0572096372ef-proxy-ca-bundles\") pod \"804ce14a-8267-4a2e-a90e-0572096372ef\" (UID: \"804ce14a-8267-4a2e-a90e-0572096372ef\") " Mar 10 00:10:35 crc kubenswrapper[4906]: I0310 00:10:35.873553 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/804ce14a-8267-4a2e-a90e-0572096372ef-serving-cert\") pod \"804ce14a-8267-4a2e-a90e-0572096372ef\" (UID: \"804ce14a-8267-4a2e-a90e-0572096372ef\") " Mar 10 00:10:35 crc kubenswrapper[4906]: I0310 00:10:35.874159 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5eb5696-e150-477c-b145-2ce5f4161410-client-ca" (OuterVolumeSpecName: "client-ca") pod "b5eb5696-e150-477c-b145-2ce5f4161410" (UID: "b5eb5696-e150-477c-b145-2ce5f4161410"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:10:35 crc kubenswrapper[4906]: I0310 00:10:35.874245 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5eb5696-e150-477c-b145-2ce5f4161410-config" (OuterVolumeSpecName: "config") pod "b5eb5696-e150-477c-b145-2ce5f4161410" (UID: "b5eb5696-e150-477c-b145-2ce5f4161410"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:10:35 crc kubenswrapper[4906]: I0310 00:10:35.874529 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/804ce14a-8267-4a2e-a90e-0572096372ef-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "804ce14a-8267-4a2e-a90e-0572096372ef" (UID: "804ce14a-8267-4a2e-a90e-0572096372ef"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:10:35 crc kubenswrapper[4906]: I0310 00:10:35.874543 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/804ce14a-8267-4a2e-a90e-0572096372ef-client-ca" (OuterVolumeSpecName: "client-ca") pod "804ce14a-8267-4a2e-a90e-0572096372ef" (UID: "804ce14a-8267-4a2e-a90e-0572096372ef"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:10:35 crc kubenswrapper[4906]: I0310 00:10:35.874595 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/804ce14a-8267-4a2e-a90e-0572096372ef-config" (OuterVolumeSpecName: "config") pod "804ce14a-8267-4a2e-a90e-0572096372ef" (UID: "804ce14a-8267-4a2e-a90e-0572096372ef"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:10:35 crc kubenswrapper[4906]: I0310 00:10:35.874900 4906 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5eb5696-e150-477c-b145-2ce5f4161410-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:35 crc kubenswrapper[4906]: I0310 00:10:35.874919 4906 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5eb5696-e150-477c-b145-2ce5f4161410-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:35 crc kubenswrapper[4906]: I0310 00:10:35.874928 4906 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/804ce14a-8267-4a2e-a90e-0572096372ef-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:35 crc kubenswrapper[4906]: I0310 00:10:35.874937 4906 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/804ce14a-8267-4a2e-a90e-0572096372ef-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:35 crc kubenswrapper[4906]: I0310 00:10:35.874945 4906 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/804ce14a-8267-4a2e-a90e-0572096372ef-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:35 crc kubenswrapper[4906]: I0310 00:10:35.879603 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5eb5696-e150-477c-b145-2ce5f4161410-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b5eb5696-e150-477c-b145-2ce5f4161410" (UID: "b5eb5696-e150-477c-b145-2ce5f4161410"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:10:35 crc kubenswrapper[4906]: I0310 00:10:35.879987 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5eb5696-e150-477c-b145-2ce5f4161410-kube-api-access-2mw76" (OuterVolumeSpecName: "kube-api-access-2mw76") pod "b5eb5696-e150-477c-b145-2ce5f4161410" (UID: "b5eb5696-e150-477c-b145-2ce5f4161410"). InnerVolumeSpecName "kube-api-access-2mw76". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:10:35 crc kubenswrapper[4906]: I0310 00:10:35.880092 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/804ce14a-8267-4a2e-a90e-0572096372ef-kube-api-access-fdb6v" (OuterVolumeSpecName: "kube-api-access-fdb6v") pod "804ce14a-8267-4a2e-a90e-0572096372ef" (UID: "804ce14a-8267-4a2e-a90e-0572096372ef"). InnerVolumeSpecName "kube-api-access-fdb6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:10:35 crc kubenswrapper[4906]: I0310 00:10:35.880717 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/804ce14a-8267-4a2e-a90e-0572096372ef-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "804ce14a-8267-4a2e-a90e-0572096372ef" (UID: "804ce14a-8267-4a2e-a90e-0572096372ef"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:10:35 crc kubenswrapper[4906]: I0310 00:10:35.976959 4906 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/804ce14a-8267-4a2e-a90e-0572096372ef-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:35 crc kubenswrapper[4906]: I0310 00:10:35.977010 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mw76\" (UniqueName: \"kubernetes.io/projected/b5eb5696-e150-477c-b145-2ce5f4161410-kube-api-access-2mw76\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:35 crc kubenswrapper[4906]: I0310 00:10:35.977023 4906 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5eb5696-e150-477c-b145-2ce5f4161410-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:35 crc kubenswrapper[4906]: I0310 00:10:35.977033 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdb6v\" (UniqueName: \"kubernetes.io/projected/804ce14a-8267-4a2e-a90e-0572096372ef-kube-api-access-fdb6v\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:36 crc kubenswrapper[4906]: I0310 00:10:36.239601 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5866d8f9d7-xmnwr"] Mar 10 00:10:36 crc kubenswrapper[4906]: E0310 00:10:36.240004 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5eb5696-e150-477c-b145-2ce5f4161410" containerName="route-controller-manager" Mar 10 00:10:36 crc kubenswrapper[4906]: I0310 00:10:36.240030 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5eb5696-e150-477c-b145-2ce5f4161410" containerName="route-controller-manager" Mar 10 00:10:36 crc kubenswrapper[4906]: E0310 00:10:36.240063 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="804ce14a-8267-4a2e-a90e-0572096372ef" containerName="controller-manager" Mar 10 00:10:36 crc kubenswrapper[4906]: I0310 
00:10:36.240092 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="804ce14a-8267-4a2e-a90e-0572096372ef" containerName="controller-manager" Mar 10 00:10:36 crc kubenswrapper[4906]: I0310 00:10:36.240270 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="804ce14a-8267-4a2e-a90e-0572096372ef" containerName="controller-manager" Mar 10 00:10:36 crc kubenswrapper[4906]: I0310 00:10:36.240290 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5eb5696-e150-477c-b145-2ce5f4161410" containerName="route-controller-manager" Mar 10 00:10:36 crc kubenswrapper[4906]: I0310 00:10:36.240974 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5866d8f9d7-xmnwr" Mar 10 00:10:36 crc kubenswrapper[4906]: I0310 00:10:36.256661 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5866d8f9d7-xmnwr"] Mar 10 00:10:36 crc kubenswrapper[4906]: I0310 00:10:36.382611 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w928n\" (UniqueName: \"kubernetes.io/projected/9480e59d-2e02-4f21-af67-923be5039dd2-kube-api-access-w928n\") pod \"controller-manager-5866d8f9d7-xmnwr\" (UID: \"9480e59d-2e02-4f21-af67-923be5039dd2\") " pod="openshift-controller-manager/controller-manager-5866d8f9d7-xmnwr" Mar 10 00:10:36 crc kubenswrapper[4906]: I0310 00:10:36.383011 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9480e59d-2e02-4f21-af67-923be5039dd2-client-ca\") pod \"controller-manager-5866d8f9d7-xmnwr\" (UID: \"9480e59d-2e02-4f21-af67-923be5039dd2\") " pod="openshift-controller-manager/controller-manager-5866d8f9d7-xmnwr" Mar 10 00:10:36 crc kubenswrapper[4906]: I0310 00:10:36.383308 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9480e59d-2e02-4f21-af67-923be5039dd2-proxy-ca-bundles\") pod \"controller-manager-5866d8f9d7-xmnwr\" (UID: \"9480e59d-2e02-4f21-af67-923be5039dd2\") " pod="openshift-controller-manager/controller-manager-5866d8f9d7-xmnwr" Mar 10 00:10:36 crc kubenswrapper[4906]: I0310 00:10:36.383461 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9480e59d-2e02-4f21-af67-923be5039dd2-config\") pod \"controller-manager-5866d8f9d7-xmnwr\" (UID: \"9480e59d-2e02-4f21-af67-923be5039dd2\") " pod="openshift-controller-manager/controller-manager-5866d8f9d7-xmnwr" Mar 10 00:10:36 crc kubenswrapper[4906]: I0310 00:10:36.383753 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9480e59d-2e02-4f21-af67-923be5039dd2-serving-cert\") pod \"controller-manager-5866d8f9d7-xmnwr\" (UID: \"9480e59d-2e02-4f21-af67-923be5039dd2\") " pod="openshift-controller-manager/controller-manager-5866d8f9d7-xmnwr" Mar 10 00:10:36 crc kubenswrapper[4906]: I0310 00:10:36.385484 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8bc597f48-dwzlt" event={"ID":"b5eb5696-e150-477c-b145-2ce5f4161410","Type":"ContainerDied","Data":"04888686209a042f3306462ed596a8c04bd7118f0a66b598ab193109f0faff58"} Mar 10 00:10:36 crc kubenswrapper[4906]: I0310 00:10:36.385664 4906 scope.go:117] "RemoveContainer" containerID="b5720c002243b3ca69d5c4d46d0d91442e2341bc795ff578132042a6ab360b9e" Mar 10 00:10:36 crc kubenswrapper[4906]: I0310 00:10:36.385879 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8bc597f48-dwzlt" Mar 10 00:10:36 crc kubenswrapper[4906]: I0310 00:10:36.390792 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76f87444f8-c5ww8" event={"ID":"804ce14a-8267-4a2e-a90e-0572096372ef","Type":"ContainerDied","Data":"c16120d4680e094882c77f761c4ec89be844bb7473f75d0752e095499d1c1172"} Mar 10 00:10:36 crc kubenswrapper[4906]: I0310 00:10:36.390867 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-76f87444f8-c5ww8" Mar 10 00:10:36 crc kubenswrapper[4906]: I0310 00:10:36.412157 4906 scope.go:117] "RemoveContainer" containerID="fda7be75d8a2f8e53ba2b2d838e62068b033c991ed85bb33108deb3630c4a3c7" Mar 10 00:10:36 crc kubenswrapper[4906]: I0310 00:10:36.423905 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-76f87444f8-c5ww8"] Mar 10 00:10:36 crc kubenswrapper[4906]: I0310 00:10:36.428908 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-76f87444f8-c5ww8"] Mar 10 00:10:36 crc kubenswrapper[4906]: I0310 00:10:36.433570 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8bc597f48-dwzlt"] Mar 10 00:10:36 crc kubenswrapper[4906]: I0310 00:10:36.436835 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8bc597f48-dwzlt"] Mar 10 00:10:36 crc kubenswrapper[4906]: I0310 00:10:36.484765 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9480e59d-2e02-4f21-af67-923be5039dd2-client-ca\") pod \"controller-manager-5866d8f9d7-xmnwr\" (UID: \"9480e59d-2e02-4f21-af67-923be5039dd2\") " 
pod="openshift-controller-manager/controller-manager-5866d8f9d7-xmnwr" Mar 10 00:10:36 crc kubenswrapper[4906]: I0310 00:10:36.484852 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9480e59d-2e02-4f21-af67-923be5039dd2-proxy-ca-bundles\") pod \"controller-manager-5866d8f9d7-xmnwr\" (UID: \"9480e59d-2e02-4f21-af67-923be5039dd2\") " pod="openshift-controller-manager/controller-manager-5866d8f9d7-xmnwr" Mar 10 00:10:36 crc kubenswrapper[4906]: I0310 00:10:36.484895 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9480e59d-2e02-4f21-af67-923be5039dd2-config\") pod \"controller-manager-5866d8f9d7-xmnwr\" (UID: \"9480e59d-2e02-4f21-af67-923be5039dd2\") " pod="openshift-controller-manager/controller-manager-5866d8f9d7-xmnwr" Mar 10 00:10:36 crc kubenswrapper[4906]: I0310 00:10:36.484918 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9480e59d-2e02-4f21-af67-923be5039dd2-serving-cert\") pod \"controller-manager-5866d8f9d7-xmnwr\" (UID: \"9480e59d-2e02-4f21-af67-923be5039dd2\") " pod="openshift-controller-manager/controller-manager-5866d8f9d7-xmnwr" Mar 10 00:10:36 crc kubenswrapper[4906]: I0310 00:10:36.484954 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w928n\" (UniqueName: \"kubernetes.io/projected/9480e59d-2e02-4f21-af67-923be5039dd2-kube-api-access-w928n\") pod \"controller-manager-5866d8f9d7-xmnwr\" (UID: \"9480e59d-2e02-4f21-af67-923be5039dd2\") " pod="openshift-controller-manager/controller-manager-5866d8f9d7-xmnwr" Mar 10 00:10:36 crc kubenswrapper[4906]: I0310 00:10:36.486505 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9480e59d-2e02-4f21-af67-923be5039dd2-client-ca\") pod 
\"controller-manager-5866d8f9d7-xmnwr\" (UID: \"9480e59d-2e02-4f21-af67-923be5039dd2\") " pod="openshift-controller-manager/controller-manager-5866d8f9d7-xmnwr" Mar 10 00:10:36 crc kubenswrapper[4906]: I0310 00:10:36.486792 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9480e59d-2e02-4f21-af67-923be5039dd2-config\") pod \"controller-manager-5866d8f9d7-xmnwr\" (UID: \"9480e59d-2e02-4f21-af67-923be5039dd2\") " pod="openshift-controller-manager/controller-manager-5866d8f9d7-xmnwr" Mar 10 00:10:36 crc kubenswrapper[4906]: I0310 00:10:36.487696 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9480e59d-2e02-4f21-af67-923be5039dd2-proxy-ca-bundles\") pod \"controller-manager-5866d8f9d7-xmnwr\" (UID: \"9480e59d-2e02-4f21-af67-923be5039dd2\") " pod="openshift-controller-manager/controller-manager-5866d8f9d7-xmnwr" Mar 10 00:10:36 crc kubenswrapper[4906]: I0310 00:10:36.493120 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9480e59d-2e02-4f21-af67-923be5039dd2-serving-cert\") pod \"controller-manager-5866d8f9d7-xmnwr\" (UID: \"9480e59d-2e02-4f21-af67-923be5039dd2\") " pod="openshift-controller-manager/controller-manager-5866d8f9d7-xmnwr" Mar 10 00:10:36 crc kubenswrapper[4906]: I0310 00:10:36.511254 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w928n\" (UniqueName: \"kubernetes.io/projected/9480e59d-2e02-4f21-af67-923be5039dd2-kube-api-access-w928n\") pod \"controller-manager-5866d8f9d7-xmnwr\" (UID: \"9480e59d-2e02-4f21-af67-923be5039dd2\") " pod="openshift-controller-manager/controller-manager-5866d8f9d7-xmnwr" Mar 10 00:10:36 crc kubenswrapper[4906]: I0310 00:10:36.586829 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5866d8f9d7-xmnwr" Mar 10 00:10:36 crc kubenswrapper[4906]: I0310 00:10:36.586969 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="804ce14a-8267-4a2e-a90e-0572096372ef" path="/var/lib/kubelet/pods/804ce14a-8267-4a2e-a90e-0572096372ef/volumes" Mar 10 00:10:36 crc kubenswrapper[4906]: I0310 00:10:36.588393 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5eb5696-e150-477c-b145-2ce5f4161410" path="/var/lib/kubelet/pods/b5eb5696-e150-477c-b145-2ce5f4161410/volumes" Mar 10 00:10:37 crc kubenswrapper[4906]: I0310 00:10:37.027873 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5866d8f9d7-xmnwr"] Mar 10 00:10:37 crc kubenswrapper[4906]: I0310 00:10:37.234936 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6985c74777-lb5tm"] Mar 10 00:10:37 crc kubenswrapper[4906]: I0310 00:10:37.235907 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6985c74777-lb5tm" Mar 10 00:10:37 crc kubenswrapper[4906]: I0310 00:10:37.238077 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 10 00:10:37 crc kubenswrapper[4906]: I0310 00:10:37.238221 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 10 00:10:37 crc kubenswrapper[4906]: I0310 00:10:37.238328 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 10 00:10:37 crc kubenswrapper[4906]: I0310 00:10:37.238464 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 10 00:10:37 crc kubenswrapper[4906]: I0310 00:10:37.238572 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 10 00:10:37 crc kubenswrapper[4906]: I0310 00:10:37.238706 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 10 00:10:37 crc kubenswrapper[4906]: I0310 00:10:37.250954 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6985c74777-lb5tm"] Mar 10 00:10:37 crc kubenswrapper[4906]: I0310 00:10:37.397414 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5866d8f9d7-xmnwr" event={"ID":"9480e59d-2e02-4f21-af67-923be5039dd2","Type":"ContainerStarted","Data":"568012848ae9cd2f0f6ea32052a9ee78ac56667e5f58ba81808196f54955cf46"} Mar 10 00:10:37 crc kubenswrapper[4906]: I0310 00:10:37.397463 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5866d8f9d7-xmnwr" 
event={"ID":"9480e59d-2e02-4f21-af67-923be5039dd2","Type":"ContainerStarted","Data":"2712d4d79cf850b76a6533f50c77bdefc28fbc6fb8e7c2fa8890fcbe4895e1f0"} Mar 10 00:10:37 crc kubenswrapper[4906]: I0310 00:10:37.398704 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5866d8f9d7-xmnwr" Mar 10 00:10:37 crc kubenswrapper[4906]: I0310 00:10:37.408443 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ecc41af8-50e6-4152-a367-f70fac192941-client-ca\") pod \"route-controller-manager-6985c74777-lb5tm\" (UID: \"ecc41af8-50e6-4152-a367-f70fac192941\") " pod="openshift-route-controller-manager/route-controller-manager-6985c74777-lb5tm" Mar 10 00:10:37 crc kubenswrapper[4906]: I0310 00:10:37.408535 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecc41af8-50e6-4152-a367-f70fac192941-config\") pod \"route-controller-manager-6985c74777-lb5tm\" (UID: \"ecc41af8-50e6-4152-a367-f70fac192941\") " pod="openshift-route-controller-manager/route-controller-manager-6985c74777-lb5tm" Mar 10 00:10:37 crc kubenswrapper[4906]: I0310 00:10:37.408577 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ecc41af8-50e6-4152-a367-f70fac192941-serving-cert\") pod \"route-controller-manager-6985c74777-lb5tm\" (UID: \"ecc41af8-50e6-4152-a367-f70fac192941\") " pod="openshift-route-controller-manager/route-controller-manager-6985c74777-lb5tm" Mar 10 00:10:37 crc kubenswrapper[4906]: I0310 00:10:37.408710 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hk7zz\" (UniqueName: \"kubernetes.io/projected/ecc41af8-50e6-4152-a367-f70fac192941-kube-api-access-hk7zz\") pod 
\"route-controller-manager-6985c74777-lb5tm\" (UID: \"ecc41af8-50e6-4152-a367-f70fac192941\") " pod="openshift-route-controller-manager/route-controller-manager-6985c74777-lb5tm" Mar 10 00:10:37 crc kubenswrapper[4906]: I0310 00:10:37.417867 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5866d8f9d7-xmnwr" Mar 10 00:10:37 crc kubenswrapper[4906]: I0310 00:10:37.421541 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5866d8f9d7-xmnwr" podStartSLOduration=2.421525701 podStartE2EDuration="2.421525701s" podCreationTimestamp="2026-03-10 00:10:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:10:37.420400039 +0000 UTC m=+263.568295151" watchObservedRunningTime="2026-03-10 00:10:37.421525701 +0000 UTC m=+263.569420813" Mar 10 00:10:37 crc kubenswrapper[4906]: I0310 00:10:37.509284 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ecc41af8-50e6-4152-a367-f70fac192941-serving-cert\") pod \"route-controller-manager-6985c74777-lb5tm\" (UID: \"ecc41af8-50e6-4152-a367-f70fac192941\") " pod="openshift-route-controller-manager/route-controller-manager-6985c74777-lb5tm" Mar 10 00:10:37 crc kubenswrapper[4906]: I0310 00:10:37.509371 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hk7zz\" (UniqueName: \"kubernetes.io/projected/ecc41af8-50e6-4152-a367-f70fac192941-kube-api-access-hk7zz\") pod \"route-controller-manager-6985c74777-lb5tm\" (UID: \"ecc41af8-50e6-4152-a367-f70fac192941\") " pod="openshift-route-controller-manager/route-controller-manager-6985c74777-lb5tm" Mar 10 00:10:37 crc kubenswrapper[4906]: I0310 00:10:37.509413 4906 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ecc41af8-50e6-4152-a367-f70fac192941-client-ca\") pod \"route-controller-manager-6985c74777-lb5tm\" (UID: \"ecc41af8-50e6-4152-a367-f70fac192941\") " pod="openshift-route-controller-manager/route-controller-manager-6985c74777-lb5tm" Mar 10 00:10:37 crc kubenswrapper[4906]: I0310 00:10:37.509458 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecc41af8-50e6-4152-a367-f70fac192941-config\") pod \"route-controller-manager-6985c74777-lb5tm\" (UID: \"ecc41af8-50e6-4152-a367-f70fac192941\") " pod="openshift-route-controller-manager/route-controller-manager-6985c74777-lb5tm" Mar 10 00:10:37 crc kubenswrapper[4906]: I0310 00:10:37.510679 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ecc41af8-50e6-4152-a367-f70fac192941-client-ca\") pod \"route-controller-manager-6985c74777-lb5tm\" (UID: \"ecc41af8-50e6-4152-a367-f70fac192941\") " pod="openshift-route-controller-manager/route-controller-manager-6985c74777-lb5tm" Mar 10 00:10:37 crc kubenswrapper[4906]: I0310 00:10:37.510837 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecc41af8-50e6-4152-a367-f70fac192941-config\") pod \"route-controller-manager-6985c74777-lb5tm\" (UID: \"ecc41af8-50e6-4152-a367-f70fac192941\") " pod="openshift-route-controller-manager/route-controller-manager-6985c74777-lb5tm" Mar 10 00:10:37 crc kubenswrapper[4906]: I0310 00:10:37.517285 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ecc41af8-50e6-4152-a367-f70fac192941-serving-cert\") pod \"route-controller-manager-6985c74777-lb5tm\" (UID: \"ecc41af8-50e6-4152-a367-f70fac192941\") " pod="openshift-route-controller-manager/route-controller-manager-6985c74777-lb5tm" Mar 10 
00:10:37 crc kubenswrapper[4906]: I0310 00:10:37.533277 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hk7zz\" (UniqueName: \"kubernetes.io/projected/ecc41af8-50e6-4152-a367-f70fac192941-kube-api-access-hk7zz\") pod \"route-controller-manager-6985c74777-lb5tm\" (UID: \"ecc41af8-50e6-4152-a367-f70fac192941\") " pod="openshift-route-controller-manager/route-controller-manager-6985c74777-lb5tm" Mar 10 00:10:37 crc kubenswrapper[4906]: I0310 00:10:37.549352 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6985c74777-lb5tm" Mar 10 00:10:37 crc kubenswrapper[4906]: I0310 00:10:37.961445 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6985c74777-lb5tm"] Mar 10 00:10:38 crc kubenswrapper[4906]: I0310 00:10:38.405765 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6985c74777-lb5tm" event={"ID":"ecc41af8-50e6-4152-a367-f70fac192941","Type":"ContainerStarted","Data":"5569ec2d4fbb8e615a5ae4ddfacfffe192184102ed3155999375b42c92f81418"} Mar 10 00:10:38 crc kubenswrapper[4906]: I0310 00:10:38.406311 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6985c74777-lb5tm" event={"ID":"ecc41af8-50e6-4152-a367-f70fac192941","Type":"ContainerStarted","Data":"366fcc82e3656316cccaf3a0747524b746588e7739b94a810c19b3bf153d47a2"} Mar 10 00:10:39 crc kubenswrapper[4906]: I0310 00:10:39.412467 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6985c74777-lb5tm" Mar 10 00:10:39 crc kubenswrapper[4906]: I0310 00:10:39.422554 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-route-controller-manager/route-controller-manager-6985c74777-lb5tm" Mar 10 00:10:39 crc kubenswrapper[4906]: I0310 00:10:39.453905 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6985c74777-lb5tm" podStartSLOduration=4.453875591 podStartE2EDuration="4.453875591s" podCreationTimestamp="2026-03-10 00:10:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:10:38.437463524 +0000 UTC m=+264.585358636" watchObservedRunningTime="2026-03-10 00:10:39.453875591 +0000 UTC m=+265.601770733" Mar 10 00:10:39 crc kubenswrapper[4906]: I0310 00:10:39.500426 4906 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 10 00:10:39 crc kubenswrapper[4906]: I0310 00:10:39.501341 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:10:39 crc kubenswrapper[4906]: I0310 00:10:39.501769 4906 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 10 00:10:39 crc kubenswrapper[4906]: I0310 00:10:39.502369 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://a1ce33b4800ac33b19f03506f197894abd2b965c34db7b506d7f06e8824b0229" gracePeriod=15 Mar 10 00:10:39 crc kubenswrapper[4906]: I0310 00:10:39.502404 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://e75634c35b3e26fec38c8aa89f5a8055117f60566c8d46cf9ec28cad40c688be" gracePeriod=15 Mar 10 
00:10:39 crc kubenswrapper[4906]: I0310 00:10:39.502467 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://1237a4e233c776796482d73c65c85e5d86e592d6a7f53a7b76fea4529fe9e435" gracePeriod=15 Mar 10 00:10:39 crc kubenswrapper[4906]: I0310 00:10:39.502502 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://e445f27670002f5ac446f9be56896acdbf9e3e30801b073b231cc8ce7ddac7c2" gracePeriod=15 Mar 10 00:10:39 crc kubenswrapper[4906]: I0310 00:10:39.502467 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://03c49063cf40b4074c120bf78de05481a9e5638b11137a47d73d15b3e1a43f41" gracePeriod=15 Mar 10 00:10:39 crc kubenswrapper[4906]: I0310 00:10:39.538951 4906 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Mar 10 00:10:39 crc kubenswrapper[4906]: I0310 00:10:39.539156 4906 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Mar 10 00:10:39 crc kubenswrapper[4906]: I0310 00:10:39.547593 4906 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:10:39 crc kubenswrapper[4906]: I0310 00:10:39.547673 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:10:39 crc kubenswrapper[4906]: I0310 00:10:39.547852 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:10:39 crc kubenswrapper[4906]: I0310 00:10:39.547972 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:10:39 crc kubenswrapper[4906]: I0310 00:10:39.548115 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:10:39 crc kubenswrapper[4906]: E0310 00:10:39.577142 4906 file.go:109] "Unable to process watch event" err="can't process config file \"/etc/kubernetes/manifests/kube-apiserver-pod.yaml\": /etc/kubernetes/manifests/kube-apiserver-pod.yaml: couldn't parse as pod(Object 'Kind' is missing in 'null'), please check config file" Mar 10 00:10:39 crc kubenswrapper[4906]: I0310 00:10:39.580010 4906 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 10 00:10:39 crc kubenswrapper[4906]: E0310 00:10:39.580590 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 10 00:10:39 crc kubenswrapper[4906]: I0310 00:10:39.580778 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 10 00:10:39 crc kubenswrapper[4906]: E0310 00:10:39.580914 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 10 00:10:39 crc kubenswrapper[4906]: I0310 00:10:39.581047 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 10 00:10:39 crc kubenswrapper[4906]: E0310 00:10:39.581154 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 10 00:10:39 crc kubenswrapper[4906]: I0310 00:10:39.581253 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 10 00:10:39 crc kubenswrapper[4906]: E0310 00:10:39.581340 4906 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 00:10:39 crc kubenswrapper[4906]: I0310 00:10:39.581439 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 00:10:39 crc kubenswrapper[4906]: E0310 00:10:39.581539 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 00:10:39 crc kubenswrapper[4906]: I0310 00:10:39.581624 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 00:10:39 crc kubenswrapper[4906]: E0310 00:10:39.581778 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 00:10:39 crc kubenswrapper[4906]: I0310 00:10:39.581897 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 00:10:39 crc kubenswrapper[4906]: E0310 00:10:39.582013 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 10 00:10:39 crc kubenswrapper[4906]: I0310 00:10:39.582137 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 10 00:10:39 crc kubenswrapper[4906]: E0310 00:10:39.582267 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 10 00:10:39 crc kubenswrapper[4906]: I0310 00:10:39.582371 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 10 00:10:39 crc kubenswrapper[4906]: I0310 00:10:39.582689 4906 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 00:10:39 crc kubenswrapper[4906]: I0310 00:10:39.582829 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 10 00:10:39 crc kubenswrapper[4906]: I0310 00:10:39.582946 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 10 00:10:39 crc kubenswrapper[4906]: I0310 00:10:39.583080 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 10 00:10:39 crc kubenswrapper[4906]: I0310 00:10:39.583194 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 00:10:39 crc kubenswrapper[4906]: I0310 00:10:39.583305 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 10 00:10:39 crc kubenswrapper[4906]: E0310 00:10:39.583618 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 00:10:39 crc kubenswrapper[4906]: I0310 00:10:39.583784 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 00:10:39 crc kubenswrapper[4906]: I0310 00:10:39.584067 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 00:10:39 crc kubenswrapper[4906]: I0310 00:10:39.584538 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Mar 10 00:10:39 crc kubenswrapper[4906]: I0310 00:10:39.649054 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:10:39 crc kubenswrapper[4906]: I0310 00:10:39.649237 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:10:39 crc kubenswrapper[4906]: I0310 00:10:39.649329 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:10:39 crc kubenswrapper[4906]: I0310 00:10:39.649163 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:10:39 crc kubenswrapper[4906]: I0310 00:10:39.649474 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:10:39 crc kubenswrapper[4906]: I0310 00:10:39.649557 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:10:39 crc kubenswrapper[4906]: I0310 00:10:39.649684 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:10:39 crc kubenswrapper[4906]: I0310 00:10:39.649742 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:10:39 crc kubenswrapper[4906]: I0310 00:10:39.649884 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:10:39 crc kubenswrapper[4906]: I0310 00:10:39.650003 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:10:39 crc kubenswrapper[4906]: I0310 00:10:39.650137 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:10:39 crc kubenswrapper[4906]: I0310 00:10:39.650112 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:10:39 crc kubenswrapper[4906]: I0310 00:10:39.650335 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:10:39 crc kubenswrapper[4906]: I0310 00:10:39.752259 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:10:39 crc kubenswrapper[4906]: I0310 00:10:39.752332 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 
00:10:39 crc kubenswrapper[4906]: I0310 00:10:39.752388 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:10:39 crc kubenswrapper[4906]: I0310 00:10:39.752396 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:10:39 crc kubenswrapper[4906]: I0310 00:10:39.752481 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:10:39 crc kubenswrapper[4906]: I0310 00:10:39.752510 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:10:39 crc kubenswrapper[4906]: I0310 00:10:39.992493 4906 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" start-of-body= Mar 10 00:10:39 crc kubenswrapper[4906]: I0310 00:10:39.992751 4906 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" Mar 10 00:10:40 crc kubenswrapper[4906]: I0310 00:10:40.425868 4906 generic.go:334] "Generic (PLEG): container finished" podID="ffc7eabb-e49f-497c-912d-f997514651c5" containerID="dc6fcf0a68b971ef82fe314f5aaa86e92f754f07405cc299eeb8aa43fba66717" exitCode=0 Mar 10 00:10:40 crc kubenswrapper[4906]: I0310 00:10:40.425967 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ffc7eabb-e49f-497c-912d-f997514651c5","Type":"ContainerDied","Data":"dc6fcf0a68b971ef82fe314f5aaa86e92f754f07405cc299eeb8aa43fba66717"} Mar 10 00:10:40 crc kubenswrapper[4906]: I0310 00:10:40.428868 4906 status_manager.go:851] "Failed to get status for pod" podUID="ffc7eabb-e49f-497c-912d-f997514651c5" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.179:6443: connect: connection refused" Mar 10 00:10:40 crc kubenswrapper[4906]: I0310 00:10:40.429783 4906 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.179:6443: connect: connection refused" Mar 10 00:10:40 crc kubenswrapper[4906]: I0310 00:10:40.430263 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 10 00:10:40 crc kubenswrapper[4906]: I0310 00:10:40.433314 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 10 00:10:40 
crc kubenswrapper[4906]: I0310 00:10:40.434428 4906 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e75634c35b3e26fec38c8aa89f5a8055117f60566c8d46cf9ec28cad40c688be" exitCode=0 Mar 10 00:10:40 crc kubenswrapper[4906]: I0310 00:10:40.434456 4906 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="03c49063cf40b4074c120bf78de05481a9e5638b11137a47d73d15b3e1a43f41" exitCode=0 Mar 10 00:10:40 crc kubenswrapper[4906]: I0310 00:10:40.434464 4906 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1237a4e233c776796482d73c65c85e5d86e592d6a7f53a7b76fea4529fe9e435" exitCode=0 Mar 10 00:10:40 crc kubenswrapper[4906]: I0310 00:10:40.434471 4906 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e445f27670002f5ac446f9be56896acdbf9e3e30801b073b231cc8ce7ddac7c2" exitCode=2 Mar 10 00:10:40 crc kubenswrapper[4906]: I0310 00:10:40.434493 4906 scope.go:117] "RemoveContainer" containerID="c4ab422d45102787731cdbfd6d0e400974761d5af01f0a43d04cc25228c1db30" Mar 10 00:10:41 crc kubenswrapper[4906]: I0310 00:10:41.447843 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 10 00:10:41 crc kubenswrapper[4906]: I0310 00:10:41.897746 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 10 00:10:41 crc kubenswrapper[4906]: I0310 00:10:41.898937 4906 status_manager.go:851] "Failed to get status for pod" podUID="ffc7eabb-e49f-497c-912d-f997514651c5" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.179:6443: connect: connection refused" Mar 10 00:10:41 crc kubenswrapper[4906]: I0310 00:10:41.905772 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 10 00:10:41 crc kubenswrapper[4906]: I0310 00:10:41.907686 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:10:41 crc kubenswrapper[4906]: I0310 00:10:41.908290 4906 status_manager.go:851] "Failed to get status for pod" podUID="ffc7eabb-e49f-497c-912d-f997514651c5" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.179:6443: connect: connection refused" Mar 10 00:10:41 crc kubenswrapper[4906]: I0310 00:10:41.908531 4906 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.179:6443: connect: connection refused" Mar 10 00:10:41 crc kubenswrapper[4906]: I0310 00:10:41.997767 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ffc7eabb-e49f-497c-912d-f997514651c5-kubelet-dir\") pod \"ffc7eabb-e49f-497c-912d-f997514651c5\" (UID: 
\"ffc7eabb-e49f-497c-912d-f997514651c5\") " Mar 10 00:10:41 crc kubenswrapper[4906]: I0310 00:10:41.997865 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 10 00:10:41 crc kubenswrapper[4906]: I0310 00:10:41.997874 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ffc7eabb-e49f-497c-912d-f997514651c5-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ffc7eabb-e49f-497c-912d-f997514651c5" (UID: "ffc7eabb-e49f-497c-912d-f997514651c5"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:10:41 crc kubenswrapper[4906]: I0310 00:10:41.997954 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ffc7eabb-e49f-497c-912d-f997514651c5-kube-api-access\") pod \"ffc7eabb-e49f-497c-912d-f997514651c5\" (UID: \"ffc7eabb-e49f-497c-912d-f997514651c5\") " Mar 10 00:10:41 crc kubenswrapper[4906]: I0310 00:10:41.997971 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 10 00:10:41 crc kubenswrapper[4906]: I0310 00:10:41.998030 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:10:41 crc kubenswrapper[4906]: I0310 00:10:41.998039 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 10 00:10:41 crc kubenswrapper[4906]: I0310 00:10:41.998064 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:10:41 crc kubenswrapper[4906]: I0310 00:10:41.998110 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ffc7eabb-e49f-497c-912d-f997514651c5-var-lock\") pod \"ffc7eabb-e49f-497c-912d-f997514651c5\" (UID: \"ffc7eabb-e49f-497c-912d-f997514651c5\") " Mar 10 00:10:41 crc kubenswrapper[4906]: I0310 00:10:41.998584 4906 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ffc7eabb-e49f-497c-912d-f997514651c5-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:41 crc kubenswrapper[4906]: I0310 00:10:41.998610 4906 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:41 crc kubenswrapper[4906]: I0310 00:10:41.998622 4906 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:41 crc kubenswrapper[4906]: I0310 00:10:41.998712 4906 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ffc7eabb-e49f-497c-912d-f997514651c5-var-lock" (OuterVolumeSpecName: "var-lock") pod "ffc7eabb-e49f-497c-912d-f997514651c5" (UID: "ffc7eabb-e49f-497c-912d-f997514651c5"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:10:41 crc kubenswrapper[4906]: I0310 00:10:41.998740 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:10:42 crc kubenswrapper[4906]: I0310 00:10:42.008835 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffc7eabb-e49f-497c-912d-f997514651c5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ffc7eabb-e49f-497c-912d-f997514651c5" (UID: "ffc7eabb-e49f-497c-912d-f997514651c5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:10:42 crc kubenswrapper[4906]: I0310 00:10:42.099586 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ffc7eabb-e49f-497c-912d-f997514651c5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:42 crc kubenswrapper[4906]: I0310 00:10:42.099625 4906 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:42 crc kubenswrapper[4906]: I0310 00:10:42.099657 4906 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ffc7eabb-e49f-497c-912d-f997514651c5-var-lock\") on node \"crc\" DevicePath \"\"" Mar 10 00:10:42 crc kubenswrapper[4906]: I0310 00:10:42.457039 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ffc7eabb-e49f-497c-912d-f997514651c5","Type":"ContainerDied","Data":"9b5b4b7f7d1c2388ae0a54f8e94b505c981764d43eaa2c0cb8372a937bced375"} Mar 10 00:10:42 crc kubenswrapper[4906]: I0310 00:10:42.457085 4906 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b5b4b7f7d1c2388ae0a54f8e94b505c981764d43eaa2c0cb8372a937bced375" Mar 10 00:10:42 crc kubenswrapper[4906]: I0310 00:10:42.457131 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 10 00:10:42 crc kubenswrapper[4906]: I0310 00:10:42.460018 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 10 00:10:42 crc kubenswrapper[4906]: I0310 00:10:42.460911 4906 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a1ce33b4800ac33b19f03506f197894abd2b965c34db7b506d7f06e8824b0229" exitCode=0 Mar 10 00:10:42 crc kubenswrapper[4906]: I0310 00:10:42.460979 4906 scope.go:117] "RemoveContainer" containerID="e75634c35b3e26fec38c8aa89f5a8055117f60566c8d46cf9ec28cad40c688be" Mar 10 00:10:42 crc kubenswrapper[4906]: I0310 00:10:42.460987 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:10:42 crc kubenswrapper[4906]: I0310 00:10:42.487732 4906 scope.go:117] "RemoveContainer" containerID="03c49063cf40b4074c120bf78de05481a9e5638b11137a47d73d15b3e1a43f41" Mar 10 00:10:42 crc kubenswrapper[4906]: I0310 00:10:42.488236 4906 status_manager.go:851] "Failed to get status for pod" podUID="ffc7eabb-e49f-497c-912d-f997514651c5" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.179:6443: connect: connection refused" Mar 10 00:10:42 crc kubenswrapper[4906]: I0310 00:10:42.488919 4906 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.179:6443: connect: connection refused" Mar 10 00:10:42 crc kubenswrapper[4906]: I0310 00:10:42.489894 4906 status_manager.go:851] "Failed to get status 
for pod" podUID="ffc7eabb-e49f-497c-912d-f997514651c5" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.179:6443: connect: connection refused" Mar 10 00:10:42 crc kubenswrapper[4906]: I0310 00:10:42.490355 4906 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.179:6443: connect: connection refused" Mar 10 00:10:42 crc kubenswrapper[4906]: I0310 00:10:42.501295 4906 scope.go:117] "RemoveContainer" containerID="1237a4e233c776796482d73c65c85e5d86e592d6a7f53a7b76fea4529fe9e435" Mar 10 00:10:42 crc kubenswrapper[4906]: I0310 00:10:42.518669 4906 scope.go:117] "RemoveContainer" containerID="e445f27670002f5ac446f9be56896acdbf9e3e30801b073b231cc8ce7ddac7c2" Mar 10 00:10:42 crc kubenswrapper[4906]: I0310 00:10:42.537776 4906 scope.go:117] "RemoveContainer" containerID="a1ce33b4800ac33b19f03506f197894abd2b965c34db7b506d7f06e8824b0229" Mar 10 00:10:42 crc kubenswrapper[4906]: I0310 00:10:42.561489 4906 scope.go:117] "RemoveContainer" containerID="b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964" Mar 10 00:10:42 crc kubenswrapper[4906]: I0310 00:10:42.593086 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 10 00:10:42 crc kubenswrapper[4906]: I0310 00:10:42.599777 4906 scope.go:117] "RemoveContainer" containerID="e75634c35b3e26fec38c8aa89f5a8055117f60566c8d46cf9ec28cad40c688be" Mar 10 00:10:42 crc kubenswrapper[4906]: E0310 00:10:42.600365 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e75634c35b3e26fec38c8aa89f5a8055117f60566c8d46cf9ec28cad40c688be\": container with ID starting with e75634c35b3e26fec38c8aa89f5a8055117f60566c8d46cf9ec28cad40c688be not found: ID does not exist" containerID="e75634c35b3e26fec38c8aa89f5a8055117f60566c8d46cf9ec28cad40c688be" Mar 10 00:10:42 crc kubenswrapper[4906]: I0310 00:10:42.600399 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e75634c35b3e26fec38c8aa89f5a8055117f60566c8d46cf9ec28cad40c688be"} err="failed to get container status \"e75634c35b3e26fec38c8aa89f5a8055117f60566c8d46cf9ec28cad40c688be\": rpc error: code = NotFound desc = could not find container \"e75634c35b3e26fec38c8aa89f5a8055117f60566c8d46cf9ec28cad40c688be\": container with ID starting with e75634c35b3e26fec38c8aa89f5a8055117f60566c8d46cf9ec28cad40c688be not found: ID does not exist" Mar 10 00:10:42 crc kubenswrapper[4906]: I0310 00:10:42.600423 4906 scope.go:117] "RemoveContainer" containerID="03c49063cf40b4074c120bf78de05481a9e5638b11137a47d73d15b3e1a43f41" Mar 10 00:10:42 crc kubenswrapper[4906]: E0310 00:10:42.601078 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03c49063cf40b4074c120bf78de05481a9e5638b11137a47d73d15b3e1a43f41\": container with ID starting with 03c49063cf40b4074c120bf78de05481a9e5638b11137a47d73d15b3e1a43f41 not found: ID does not exist" containerID="03c49063cf40b4074c120bf78de05481a9e5638b11137a47d73d15b3e1a43f41" Mar 10 00:10:42 crc kubenswrapper[4906]: I0310 00:10:42.601109 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03c49063cf40b4074c120bf78de05481a9e5638b11137a47d73d15b3e1a43f41"} err="failed to get container status \"03c49063cf40b4074c120bf78de05481a9e5638b11137a47d73d15b3e1a43f41\": rpc error: code = NotFound desc = could not find container \"03c49063cf40b4074c120bf78de05481a9e5638b11137a47d73d15b3e1a43f41\": container with ID 
starting with 03c49063cf40b4074c120bf78de05481a9e5638b11137a47d73d15b3e1a43f41 not found: ID does not exist" Mar 10 00:10:42 crc kubenswrapper[4906]: I0310 00:10:42.601127 4906 scope.go:117] "RemoveContainer" containerID="1237a4e233c776796482d73c65c85e5d86e592d6a7f53a7b76fea4529fe9e435" Mar 10 00:10:42 crc kubenswrapper[4906]: E0310 00:10:42.601702 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1237a4e233c776796482d73c65c85e5d86e592d6a7f53a7b76fea4529fe9e435\": container with ID starting with 1237a4e233c776796482d73c65c85e5d86e592d6a7f53a7b76fea4529fe9e435 not found: ID does not exist" containerID="1237a4e233c776796482d73c65c85e5d86e592d6a7f53a7b76fea4529fe9e435" Mar 10 00:10:42 crc kubenswrapper[4906]: I0310 00:10:42.601756 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1237a4e233c776796482d73c65c85e5d86e592d6a7f53a7b76fea4529fe9e435"} err="failed to get container status \"1237a4e233c776796482d73c65c85e5d86e592d6a7f53a7b76fea4529fe9e435\": rpc error: code = NotFound desc = could not find container \"1237a4e233c776796482d73c65c85e5d86e592d6a7f53a7b76fea4529fe9e435\": container with ID starting with 1237a4e233c776796482d73c65c85e5d86e592d6a7f53a7b76fea4529fe9e435 not found: ID does not exist" Mar 10 00:10:42 crc kubenswrapper[4906]: I0310 00:10:42.601793 4906 scope.go:117] "RemoveContainer" containerID="e445f27670002f5ac446f9be56896acdbf9e3e30801b073b231cc8ce7ddac7c2" Mar 10 00:10:42 crc kubenswrapper[4906]: E0310 00:10:42.602207 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e445f27670002f5ac446f9be56896acdbf9e3e30801b073b231cc8ce7ddac7c2\": container with ID starting with e445f27670002f5ac446f9be56896acdbf9e3e30801b073b231cc8ce7ddac7c2 not found: ID does not exist" containerID="e445f27670002f5ac446f9be56896acdbf9e3e30801b073b231cc8ce7ddac7c2" Mar 10 
00:10:42 crc kubenswrapper[4906]: I0310 00:10:42.602237 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e445f27670002f5ac446f9be56896acdbf9e3e30801b073b231cc8ce7ddac7c2"} err="failed to get container status \"e445f27670002f5ac446f9be56896acdbf9e3e30801b073b231cc8ce7ddac7c2\": rpc error: code = NotFound desc = could not find container \"e445f27670002f5ac446f9be56896acdbf9e3e30801b073b231cc8ce7ddac7c2\": container with ID starting with e445f27670002f5ac446f9be56896acdbf9e3e30801b073b231cc8ce7ddac7c2 not found: ID does not exist" Mar 10 00:10:42 crc kubenswrapper[4906]: I0310 00:10:42.602254 4906 scope.go:117] "RemoveContainer" containerID="a1ce33b4800ac33b19f03506f197894abd2b965c34db7b506d7f06e8824b0229" Mar 10 00:10:42 crc kubenswrapper[4906]: E0310 00:10:42.602488 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1ce33b4800ac33b19f03506f197894abd2b965c34db7b506d7f06e8824b0229\": container with ID starting with a1ce33b4800ac33b19f03506f197894abd2b965c34db7b506d7f06e8824b0229 not found: ID does not exist" containerID="a1ce33b4800ac33b19f03506f197894abd2b965c34db7b506d7f06e8824b0229" Mar 10 00:10:42 crc kubenswrapper[4906]: I0310 00:10:42.602514 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1ce33b4800ac33b19f03506f197894abd2b965c34db7b506d7f06e8824b0229"} err="failed to get container status \"a1ce33b4800ac33b19f03506f197894abd2b965c34db7b506d7f06e8824b0229\": rpc error: code = NotFound desc = could not find container \"a1ce33b4800ac33b19f03506f197894abd2b965c34db7b506d7f06e8824b0229\": container with ID starting with a1ce33b4800ac33b19f03506f197894abd2b965c34db7b506d7f06e8824b0229 not found: ID does not exist" Mar 10 00:10:42 crc kubenswrapper[4906]: I0310 00:10:42.602536 4906 scope.go:117] "RemoveContainer" 
containerID="b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964" Mar 10 00:10:42 crc kubenswrapper[4906]: E0310 00:10:42.603079 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\": container with ID starting with b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964 not found: ID does not exist" containerID="b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964" Mar 10 00:10:42 crc kubenswrapper[4906]: I0310 00:10:42.603109 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964"} err="failed to get container status \"b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\": rpc error: code = NotFound desc = could not find container \"b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964\": container with ID starting with b6461cb5b49b70adf1f5cba9a666dca170f5955a2014a2d1e693bfb82373f964 not found: ID does not exist" Mar 10 00:10:44 crc kubenswrapper[4906]: E0310 00:10:44.564766 4906 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.179:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:10:44 crc kubenswrapper[4906]: I0310 00:10:44.565780 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:10:44 crc kubenswrapper[4906]: I0310 00:10:44.579878 4906 status_manager.go:851] "Failed to get status for pod" podUID="ffc7eabb-e49f-497c-912d-f997514651c5" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.179:6443: connect: connection refused" Mar 10 00:10:44 crc kubenswrapper[4906]: E0310 00:10:44.593224 4906 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.179:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189b525e9ba9ace5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:10:44.592413925 +0000 UTC m=+270.740309087,LastTimestamp:2026-03-10 00:10:44.592413925 +0000 UTC m=+270.740309087,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:10:45 crc kubenswrapper[4906]: I0310 00:10:45.480494 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"fbf5da406f0ffae7503bc7940fdf80767f4e05d6974f27a267f036c83a678df4"} Mar 10 00:10:45 crc kubenswrapper[4906]: I0310 00:10:45.481064 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"8146d09a3d2c33f1c5a4eb69bbd5ea95c9e3e94c5e667d2e799a734fa08e6c59"} Mar 10 00:10:45 crc kubenswrapper[4906]: E0310 00:10:45.482150 4906 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.179:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:10:45 crc kubenswrapper[4906]: I0310 00:10:45.482153 4906 status_manager.go:851] "Failed to get status for pod" podUID="ffc7eabb-e49f-497c-912d-f997514651c5" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.179:6443: connect: connection refused" Mar 10 00:10:46 crc kubenswrapper[4906]: E0310 00:10:46.691428 4906 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.179:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189b525e9ba9ace5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 00:10:44.592413925 +0000 UTC m=+270.740309087,LastTimestamp:2026-03-10 00:10:44.592413925 +0000 UTC m=+270.740309087,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 00:10:48 crc kubenswrapper[4906]: E0310 00:10:48.503390 4906 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.179:6443: connect: connection refused" Mar 10 00:10:48 crc kubenswrapper[4906]: E0310 00:10:48.504629 4906 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.179:6443: connect: connection refused" Mar 10 00:10:48 crc kubenswrapper[4906]: E0310 00:10:48.505545 4906 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.179:6443: connect: connection refused" Mar 10 00:10:48 crc kubenswrapper[4906]: E0310 00:10:48.506978 4906 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.179:6443: connect: connection refused" Mar 10 00:10:48 crc kubenswrapper[4906]: E0310 00:10:48.507717 4906 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.179:6443: connect: connection refused" Mar 10 00:10:48 crc 
kubenswrapper[4906]: I0310 00:10:48.507802 4906 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 10 00:10:48 crc kubenswrapper[4906]: E0310 00:10:48.508459 4906 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.179:6443: connect: connection refused" interval="200ms" Mar 10 00:10:48 crc kubenswrapper[4906]: E0310 00:10:48.710330 4906 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.179:6443: connect: connection refused" interval="400ms" Mar 10 00:10:49 crc kubenswrapper[4906]: E0310 00:10:49.111899 4906 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.179:6443: connect: connection refused" interval="800ms" Mar 10 00:10:49 crc kubenswrapper[4906]: E0310 00:10:49.913370 4906 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.179:6443: connect: connection refused" interval="1.6s" Mar 10 00:10:51 crc kubenswrapper[4906]: E0310 00:10:51.514847 4906 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.179:6443: connect: connection refused" interval="3.2s" Mar 10 00:10:53 crc kubenswrapper[4906]: I0310 00:10:53.576406 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:10:53 crc kubenswrapper[4906]: I0310 00:10:53.578111 4906 status_manager.go:851] "Failed to get status for pod" podUID="ffc7eabb-e49f-497c-912d-f997514651c5" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.179:6443: connect: connection refused" Mar 10 00:10:53 crc kubenswrapper[4906]: I0310 00:10:53.593284 4906 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8518e6b0-48c0-4076-a58a-e2ba8751c90f" Mar 10 00:10:53 crc kubenswrapper[4906]: I0310 00:10:53.593316 4906 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8518e6b0-48c0-4076-a58a-e2ba8751c90f" Mar 10 00:10:53 crc kubenswrapper[4906]: E0310 00:10:53.593801 4906 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.179:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:10:53 crc kubenswrapper[4906]: I0310 00:10:53.594251 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:10:53 crc kubenswrapper[4906]: W0310 00:10:53.616546 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-34fb715f3e7a97ec26822ce82aa2fb38687e87285afeb5c0a5581ffca2ec0420 WatchSource:0}: Error finding container 34fb715f3e7a97ec26822ce82aa2fb38687e87285afeb5c0a5581ffca2ec0420: Status 404 returned error can't find the container with id 34fb715f3e7a97ec26822ce82aa2fb38687e87285afeb5c0a5581ffca2ec0420 Mar 10 00:10:54 crc kubenswrapper[4906]: I0310 00:10:54.553075 4906 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="7f05ff9c536885c6b001f8bede56f866b56a66b5f61696d63b69201edd698925" exitCode=0 Mar 10 00:10:54 crc kubenswrapper[4906]: I0310 00:10:54.553214 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"7f05ff9c536885c6b001f8bede56f866b56a66b5f61696d63b69201edd698925"} Mar 10 00:10:54 crc kubenswrapper[4906]: I0310 00:10:54.553482 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"34fb715f3e7a97ec26822ce82aa2fb38687e87285afeb5c0a5581ffca2ec0420"} Mar 10 00:10:54 crc kubenswrapper[4906]: I0310 00:10:54.554009 4906 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8518e6b0-48c0-4076-a58a-e2ba8751c90f" Mar 10 00:10:54 crc kubenswrapper[4906]: I0310 00:10:54.554050 4906 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8518e6b0-48c0-4076-a58a-e2ba8751c90f" Mar 10 00:10:54 crc kubenswrapper[4906]: I0310 00:10:54.554451 4906 status_manager.go:851] 
"Failed to get status for pod" podUID="ffc7eabb-e49f-497c-912d-f997514651c5" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.179:6443: connect: connection refused" Mar 10 00:10:54 crc kubenswrapper[4906]: E0310 00:10:54.554612 4906 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.179:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:10:54 crc kubenswrapper[4906]: I0310 00:10:54.557030 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 10 00:10:54 crc kubenswrapper[4906]: I0310 00:10:54.557910 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 10 00:10:54 crc kubenswrapper[4906]: I0310 00:10:54.558098 4906 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="aefd20293d0c60b7686525f505af7f8d1ddd1ab1b3c63f95787470f6ef538df2" exitCode=1 Mar 10 00:10:54 crc kubenswrapper[4906]: I0310 00:10:54.558143 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"aefd20293d0c60b7686525f505af7f8d1ddd1ab1b3c63f95787470f6ef538df2"} Mar 10 00:10:54 crc kubenswrapper[4906]: I0310 00:10:54.558725 4906 scope.go:117] "RemoveContainer" containerID="aefd20293d0c60b7686525f505af7f8d1ddd1ab1b3c63f95787470f6ef538df2" Mar 10 00:10:54 crc kubenswrapper[4906]: I0310 00:10:54.558972 4906 status_manager.go:851] 
"Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.179:6443: connect: connection refused" Mar 10 00:10:54 crc kubenswrapper[4906]: I0310 00:10:54.559385 4906 status_manager.go:851] "Failed to get status for pod" podUID="ffc7eabb-e49f-497c-912d-f997514651c5" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.179:6443: connect: connection refused" Mar 10 00:10:54 crc kubenswrapper[4906]: I0310 00:10:54.607053 4906 status_manager.go:851] "Failed to get status for pod" podUID="ffc7eabb-e49f-497c-912d-f997514651c5" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.179:6443: connect: connection refused" Mar 10 00:10:54 crc kubenswrapper[4906]: I0310 00:10:54.608141 4906 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.179:6443: connect: connection refused" Mar 10 00:10:54 crc kubenswrapper[4906]: I0310 00:10:54.608679 4906 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.179:6443: connect: connection refused" Mar 10 00:10:54 crc kubenswrapper[4906]: E0310 00:10:54.716951 4906 controller.go:145] 
"Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.179:6443: connect: connection refused" interval="6.4s" Mar 10 00:10:55 crc kubenswrapper[4906]: I0310 00:10:55.574440 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"523f8f6d76e353661876e8a52fa403709494280a46d190583a0544c05e264758"} Mar 10 00:10:55 crc kubenswrapper[4906]: I0310 00:10:55.574488 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"64e9287a7a600cc36eba29075c7449dff51c7c17991c243f8621af6efed95a68"} Mar 10 00:10:55 crc kubenswrapper[4906]: I0310 00:10:55.574499 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"11fe3893d2bb8576fb3913dabbc5178c15e702852ba672f10de17b1e0dbc8b3e"} Mar 10 00:10:55 crc kubenswrapper[4906]: I0310 00:10:55.578596 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 10 00:10:55 crc kubenswrapper[4906]: I0310 00:10:55.579128 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 10 00:10:55 crc kubenswrapper[4906]: I0310 00:10:55.579176 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"384b81e4066d5fcf557713c41b177b7b95ad2dc8d92564c9e34368a2c2a73f5b"} Mar 10 00:10:56 crc kubenswrapper[4906]: I0310 00:10:56.586082 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1e16958b2df9e348846667d9ae0b792032377ec91d0e861ff0767fa3d5d0d9cf"} Mar 10 00:10:56 crc kubenswrapper[4906]: I0310 00:10:56.586339 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:10:56 crc kubenswrapper[4906]: I0310 00:10:56.586349 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"bc90df7a71f8aa89aa68ffd992a1d5b8380644ee30b22c72321dd8a1b8c6a12f"} Mar 10 00:10:56 crc kubenswrapper[4906]: I0310 00:10:56.586389 4906 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8518e6b0-48c0-4076-a58a-e2ba8751c90f" Mar 10 00:10:56 crc kubenswrapper[4906]: I0310 00:10:56.586412 4906 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8518e6b0-48c0-4076-a58a-e2ba8751c90f" Mar 10 00:10:57 crc kubenswrapper[4906]: I0310 00:10:57.611797 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:10:58 crc kubenswrapper[4906]: I0310 00:10:58.594356 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:10:58 crc kubenswrapper[4906]: I0310 00:10:58.594651 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:10:58 crc kubenswrapper[4906]: I0310 
00:10:58.600674 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:11:00 crc kubenswrapper[4906]: I0310 00:11:00.502385 4906 patch_prober.go:28] interesting pod/machine-config-daemon-bxtw4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:11:00 crc kubenswrapper[4906]: I0310 00:11:00.502463 4906 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:11:00 crc kubenswrapper[4906]: I0310 00:11:00.502778 4906 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" Mar 10 00:11:00 crc kubenswrapper[4906]: I0310 00:11:00.503622 4906 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f53c021f85bbd1f77bb00169203a7ee8629e31f3fab47b40dc594d7995d4b82a"} pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 00:11:00 crc kubenswrapper[4906]: I0310 00:11:00.503769 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" containerName="machine-config-daemon" containerID="cri-o://f53c021f85bbd1f77bb00169203a7ee8629e31f3fab47b40dc594d7995d4b82a" gracePeriod=600 Mar 10 00:11:01 crc kubenswrapper[4906]: I0310 00:11:01.599866 4906 
kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:11:01 crc kubenswrapper[4906]: I0310 00:11:01.604511 4906 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8518e6b0-48c0-4076-a58a-e2ba8751c90f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11fe3893d2bb8576fb3913dabbc5178c15e702852ba672f10de17b1e0dbc8b3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:10:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://523f8f6d76e353661876e8a52fa403709494280a46d190583a0544c05e264758\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster
-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:10:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e9287a7a600cc36eba29075c7449dff51c7c17991c243f8621af6efed95a68\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:10:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e16958b2df9e348846667d9ae0b792032377ec91d0e861ff0767fa3d5d0d9cf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":tr
ue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:10:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc90df7a71f8aa89aa68ffd992a1d5b8380644ee30b22c72321dd8a1b8c6a12f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T00:10:55Z\\\"}}}],\\\"phase\\\":\\\"Running\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": pods \"kube-apiserver-crc\" not found" Mar 10 00:11:01 crc kubenswrapper[4906]: I0310 00:11:01.620368 4906 generic.go:334] "Generic (PLEG): container finished" podID="72d61d35-0a64-45a5-8df3-9c429727deba" containerID="f53c021f85bbd1f77bb00169203a7ee8629e31f3fab47b40dc594d7995d4b82a" exitCode=0 Mar 10 00:11:01 crc kubenswrapper[4906]: I0310 00:11:01.620438 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" event={"ID":"72d61d35-0a64-45a5-8df3-9c429727deba","Type":"ContainerDied","Data":"f53c021f85bbd1f77bb00169203a7ee8629e31f3fab47b40dc594d7995d4b82a"} Mar 10 00:11:01 crc kubenswrapper[4906]: I0310 00:11:01.620493 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" event={"ID":"72d61d35-0a64-45a5-8df3-9c429727deba","Type":"ContainerStarted","Data":"8e93da60220c7a28ef01def4f9a5029323cddf710d54a9d14b770a8f7137b36e"} Mar 
10 00:11:01 crc kubenswrapper[4906]: I0310 00:11:01.625843 4906 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="fba10198-0bab-4f52-9328-5637e8aae714" Mar 10 00:11:01 crc kubenswrapper[4906]: I0310 00:11:01.844728 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:11:01 crc kubenswrapper[4906]: I0310 00:11:01.848680 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:11:02 crc kubenswrapper[4906]: I0310 00:11:02.626422 4906 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8518e6b0-48c0-4076-a58a-e2ba8751c90f" Mar 10 00:11:02 crc kubenswrapper[4906]: I0310 00:11:02.626761 4906 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8518e6b0-48c0-4076-a58a-e2ba8751c90f" Mar 10 00:11:02 crc kubenswrapper[4906]: I0310 00:11:02.632442 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:11:03 crc kubenswrapper[4906]: I0310 00:11:03.632261 4906 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8518e6b0-48c0-4076-a58a-e2ba8751c90f" Mar 10 00:11:03 crc kubenswrapper[4906]: I0310 00:11:03.632305 4906 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8518e6b0-48c0-4076-a58a-e2ba8751c90f" Mar 10 00:11:04 crc kubenswrapper[4906]: I0310 00:11:04.617174 4906 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" 
podUID="fba10198-0bab-4f52-9328-5637e8aae714" Mar 10 00:11:07 crc kubenswrapper[4906]: I0310 00:11:07.616114 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 00:11:11 crc kubenswrapper[4906]: I0310 00:11:11.861013 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 10 00:11:12 crc kubenswrapper[4906]: I0310 00:11:12.043141 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 10 00:11:12 crc kubenswrapper[4906]: I0310 00:11:12.272331 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 10 00:11:12 crc kubenswrapper[4906]: I0310 00:11:12.367809 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 10 00:11:12 crc kubenswrapper[4906]: I0310 00:11:12.563224 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 10 00:11:12 crc kubenswrapper[4906]: I0310 00:11:12.740793 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 10 00:11:12 crc kubenswrapper[4906]: I0310 00:11:12.787892 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 10 00:11:12 crc kubenswrapper[4906]: I0310 00:11:12.861511 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 10 00:11:13 crc kubenswrapper[4906]: I0310 00:11:13.122214 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 10 00:11:13 crc kubenswrapper[4906]: I0310 
00:11:13.132371 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 10 00:11:13 crc kubenswrapper[4906]: I0310 00:11:13.439927 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 10 00:11:13 crc kubenswrapper[4906]: I0310 00:11:13.496082 4906 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 10 00:11:13 crc kubenswrapper[4906]: I0310 00:11:13.572929 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 10 00:11:13 crc kubenswrapper[4906]: I0310 00:11:13.641017 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 10 00:11:13 crc kubenswrapper[4906]: I0310 00:11:13.705226 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 10 00:11:13 crc kubenswrapper[4906]: I0310 00:11:13.982486 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 10 00:11:14 crc kubenswrapper[4906]: I0310 00:11:14.163294 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 10 00:11:14 crc kubenswrapper[4906]: I0310 00:11:14.209460 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 10 00:11:14 crc kubenswrapper[4906]: I0310 00:11:14.357265 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 10 00:11:14 crc kubenswrapper[4906]: I0310 00:11:14.428622 4906 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 10 00:11:14 crc kubenswrapper[4906]: I0310 00:11:14.434350 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 10 00:11:14 crc kubenswrapper[4906]: I0310 00:11:14.509103 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 10 00:11:14 crc kubenswrapper[4906]: I0310 00:11:14.618335 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 10 00:11:14 crc kubenswrapper[4906]: I0310 00:11:14.815302 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 10 00:11:14 crc kubenswrapper[4906]: I0310 00:11:14.838610 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 10 00:11:14 crc kubenswrapper[4906]: I0310 00:11:14.906040 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 10 00:11:14 crc kubenswrapper[4906]: I0310 00:11:14.983602 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 10 00:11:15 crc kubenswrapper[4906]: I0310 00:11:15.045555 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 10 00:11:15 crc kubenswrapper[4906]: I0310 00:11:15.069263 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 10 00:11:15 crc kubenswrapper[4906]: I0310 00:11:15.160281 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 10 00:11:15 
crc kubenswrapper[4906]: I0310 00:11:15.174311 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 10 00:11:15 crc kubenswrapper[4906]: I0310 00:11:15.211370 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 10 00:11:15 crc kubenswrapper[4906]: I0310 00:11:15.265861 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 10 00:11:15 crc kubenswrapper[4906]: I0310 00:11:15.290464 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 10 00:11:15 crc kubenswrapper[4906]: I0310 00:11:15.294985 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 10 00:11:15 crc kubenswrapper[4906]: I0310 00:11:15.318185 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 10 00:11:15 crc kubenswrapper[4906]: I0310 00:11:15.341477 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 10 00:11:15 crc kubenswrapper[4906]: I0310 00:11:15.365153 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 10 00:11:15 crc kubenswrapper[4906]: I0310 00:11:15.438438 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 10 00:11:15 crc kubenswrapper[4906]: I0310 00:11:15.460529 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 10 00:11:15 crc kubenswrapper[4906]: I0310 00:11:15.479161 4906 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 10 00:11:15 crc kubenswrapper[4906]: 
I0310 00:11:15.527448 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 10 00:11:15 crc kubenswrapper[4906]: I0310 00:11:15.537941 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 10 00:11:15 crc kubenswrapper[4906]: I0310 00:11:15.585514 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 10 00:11:15 crc kubenswrapper[4906]: I0310 00:11:15.621291 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 10 00:11:15 crc kubenswrapper[4906]: I0310 00:11:15.640599 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 10 00:11:15 crc kubenswrapper[4906]: I0310 00:11:15.656240 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 10 00:11:15 crc kubenswrapper[4906]: I0310 00:11:15.713244 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 10 00:11:15 crc kubenswrapper[4906]: I0310 00:11:15.762892 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 10 00:11:15 crc kubenswrapper[4906]: I0310 00:11:15.765380 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 10 00:11:15 crc kubenswrapper[4906]: I0310 00:11:15.838917 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 10 00:11:15 crc kubenswrapper[4906]: I0310 00:11:15.904127 4906 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console-operator"/"kube-root-ca.crt" Mar 10 00:11:15 crc kubenswrapper[4906]: I0310 00:11:15.923715 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 10 00:11:16 crc kubenswrapper[4906]: I0310 00:11:16.019816 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 10 00:11:16 crc kubenswrapper[4906]: I0310 00:11:16.091107 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 10 00:11:16 crc kubenswrapper[4906]: I0310 00:11:16.148286 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 10 00:11:16 crc kubenswrapper[4906]: I0310 00:11:16.160497 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 10 00:11:16 crc kubenswrapper[4906]: I0310 00:11:16.169697 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 10 00:11:16 crc kubenswrapper[4906]: I0310 00:11:16.201470 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 10 00:11:16 crc kubenswrapper[4906]: I0310 00:11:16.215541 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 10 00:11:16 crc kubenswrapper[4906]: I0310 00:11:16.256123 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 10 00:11:16 crc kubenswrapper[4906]: I0310 00:11:16.277509 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 10 00:11:16 crc kubenswrapper[4906]: I0310 00:11:16.286589 4906 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 10 00:11:16 crc kubenswrapper[4906]: I0310 00:11:16.377010 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 10 00:11:16 crc kubenswrapper[4906]: I0310 00:11:16.544877 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 10 00:11:16 crc kubenswrapper[4906]: I0310 00:11:16.586491 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 10 00:11:16 crc kubenswrapper[4906]: I0310 00:11:16.610816 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 10 00:11:16 crc kubenswrapper[4906]: I0310 00:11:16.612296 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 10 00:11:16 crc kubenswrapper[4906]: I0310 00:11:16.655784 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 10 00:11:16 crc kubenswrapper[4906]: I0310 00:11:16.676303 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 10 00:11:16 crc kubenswrapper[4906]: I0310 00:11:16.763664 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 10 00:11:16 crc kubenswrapper[4906]: I0310 00:11:16.801498 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 10 00:11:16 crc kubenswrapper[4906]: I0310 00:11:16.869805 4906 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 10 00:11:16 crc kubenswrapper[4906]: I0310 00:11:16.880474 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 10 00:11:16 crc kubenswrapper[4906]: I0310 00:11:16.930705 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 10 00:11:17 crc kubenswrapper[4906]: I0310 00:11:17.015659 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 10 00:11:17 crc kubenswrapper[4906]: I0310 00:11:17.020220 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 10 00:11:17 crc kubenswrapper[4906]: I0310 00:11:17.026582 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 10 00:11:17 crc kubenswrapper[4906]: I0310 00:11:17.055628 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 10 00:11:17 crc kubenswrapper[4906]: I0310 00:11:17.062470 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 10 00:11:17 crc kubenswrapper[4906]: I0310 00:11:17.120953 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 10 00:11:17 crc kubenswrapper[4906]: I0310 00:11:17.140682 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 10 00:11:17 crc kubenswrapper[4906]: I0310 00:11:17.167266 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 10 
00:11:17 crc kubenswrapper[4906]: I0310 00:11:17.195276 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 10 00:11:17 crc kubenswrapper[4906]: I0310 00:11:17.201356 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 10 00:11:17 crc kubenswrapper[4906]: I0310 00:11:17.221400 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 10 00:11:17 crc kubenswrapper[4906]: I0310 00:11:17.282968 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 10 00:11:17 crc kubenswrapper[4906]: I0310 00:11:17.326850 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 10 00:11:17 crc kubenswrapper[4906]: I0310 00:11:17.428410 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 10 00:11:17 crc kubenswrapper[4906]: I0310 00:11:17.514152 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 10 00:11:17 crc kubenswrapper[4906]: I0310 00:11:17.536591 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 10 00:11:17 crc kubenswrapper[4906]: I0310 00:11:17.628586 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 10 00:11:17 crc kubenswrapper[4906]: I0310 00:11:17.695784 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 10 00:11:17 crc kubenswrapper[4906]: I0310 00:11:17.770182 4906 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 10 00:11:17 crc kubenswrapper[4906]: I0310 00:11:17.856939 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 10 00:11:17 crc kubenswrapper[4906]: I0310 00:11:17.930427 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 10 00:11:17 crc kubenswrapper[4906]: I0310 00:11:17.977812 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 10 00:11:17 crc kubenswrapper[4906]: I0310 00:11:17.986629 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 10 00:11:18 crc kubenswrapper[4906]: I0310 00:11:18.013222 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 10 00:11:18 crc kubenswrapper[4906]: I0310 00:11:18.081420 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 10 00:11:18 crc kubenswrapper[4906]: I0310 00:11:18.090677 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 10 00:11:18 crc kubenswrapper[4906]: I0310 00:11:18.109361 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 10 00:11:18 crc kubenswrapper[4906]: I0310 00:11:18.133795 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 10 00:11:18 crc kubenswrapper[4906]: I0310 00:11:18.247306 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 10 00:11:18 crc 
kubenswrapper[4906]: I0310 00:11:18.315127 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 10 00:11:18 crc kubenswrapper[4906]: I0310 00:11:18.326677 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 10 00:11:18 crc kubenswrapper[4906]: I0310 00:11:18.391512 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 10 00:11:18 crc kubenswrapper[4906]: I0310 00:11:18.430824 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 10 00:11:18 crc kubenswrapper[4906]: I0310 00:11:18.493620 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 10 00:11:18 crc kubenswrapper[4906]: I0310 00:11:18.498138 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 10 00:11:18 crc kubenswrapper[4906]: I0310 00:11:18.529821 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 10 00:11:18 crc kubenswrapper[4906]: I0310 00:11:18.546055 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 10 00:11:18 crc kubenswrapper[4906]: I0310 00:11:18.657190 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 10 00:11:18 crc kubenswrapper[4906]: I0310 00:11:18.678719 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 10 00:11:18 crc kubenswrapper[4906]: I0310 00:11:18.825285 4906 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca"/"kube-root-ca.crt" Mar 10 00:11:18 crc kubenswrapper[4906]: I0310 00:11:18.878817 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 10 00:11:18 crc kubenswrapper[4906]: I0310 00:11:18.888792 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 10 00:11:18 crc kubenswrapper[4906]: I0310 00:11:18.895795 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 10 00:11:18 crc kubenswrapper[4906]: I0310 00:11:18.910271 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 10 00:11:18 crc kubenswrapper[4906]: I0310 00:11:18.930963 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 10 00:11:18 crc kubenswrapper[4906]: I0310 00:11:18.931669 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 10 00:11:19 crc kubenswrapper[4906]: I0310 00:11:19.096681 4906 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 10 00:11:19 crc kubenswrapper[4906]: I0310 00:11:19.100300 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 10 00:11:19 crc kubenswrapper[4906]: I0310 00:11:19.147217 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 10 00:11:19 crc kubenswrapper[4906]: I0310 00:11:19.182463 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 10 00:11:19 crc kubenswrapper[4906]: I0310 00:11:19.287250 4906 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 10 00:11:19 crc kubenswrapper[4906]: I0310 00:11:19.423498 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 10 00:11:19 crc kubenswrapper[4906]: I0310 00:11:19.488939 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 10 00:11:19 crc kubenswrapper[4906]: I0310 00:11:19.516039 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 10 00:11:19 crc kubenswrapper[4906]: I0310 00:11:19.532480 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 10 00:11:19 crc kubenswrapper[4906]: I0310 00:11:19.539382 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 10 00:11:19 crc kubenswrapper[4906]: I0310 00:11:19.558502 4906 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 10 00:11:19 crc kubenswrapper[4906]: I0310 00:11:19.567272 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 10 00:11:19 crc kubenswrapper[4906]: I0310 00:11:19.567377 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 10 00:11:19 crc kubenswrapper[4906]: I0310 00:11:19.576264 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 00:11:19 crc kubenswrapper[4906]: I0310 00:11:19.593499 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=18.593460674 podStartE2EDuration="18.593460674s" podCreationTimestamp="2026-03-10 00:11:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:11:19.592307221 +0000 UTC m=+305.740202363" watchObservedRunningTime="2026-03-10 00:11:19.593460674 +0000 UTC m=+305.741355826" Mar 10 00:11:19 crc kubenswrapper[4906]: I0310 00:11:19.808261 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 10 00:11:19 crc kubenswrapper[4906]: I0310 00:11:19.819510 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 10 00:11:19 crc kubenswrapper[4906]: I0310 00:11:19.980295 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 10 00:11:20 crc kubenswrapper[4906]: I0310 00:11:20.003610 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 10 00:11:20 crc kubenswrapper[4906]: I0310 00:11:20.064780 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 10 00:11:20 crc kubenswrapper[4906]: I0310 00:11:20.075168 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 10 00:11:20 crc kubenswrapper[4906]: I0310 00:11:20.109183 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 10 00:11:20 crc kubenswrapper[4906]: I0310 00:11:20.187964 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 10 00:11:20 crc kubenswrapper[4906]: I0310 00:11:20.213142 4906 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 10 00:11:20 crc kubenswrapper[4906]: I0310 00:11:20.273501 4906 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 10 00:11:20 crc kubenswrapper[4906]: I0310 00:11:20.290888 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 10 00:11:20 crc kubenswrapper[4906]: I0310 00:11:20.401966 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 10 00:11:20 crc kubenswrapper[4906]: I0310 00:11:20.477438 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 10 00:11:20 crc kubenswrapper[4906]: I0310 00:11:20.538378 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 10 00:11:20 crc kubenswrapper[4906]: I0310 00:11:20.542659 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 10 00:11:20 crc kubenswrapper[4906]: I0310 00:11:20.610058 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 10 00:11:20 crc kubenswrapper[4906]: I0310 00:11:20.625744 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 10 00:11:20 crc kubenswrapper[4906]: I0310 00:11:20.639032 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 10 00:11:20 crc kubenswrapper[4906]: I0310 00:11:20.641256 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 10 00:11:20 crc kubenswrapper[4906]: I0310 00:11:20.650262 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 10 00:11:20 crc kubenswrapper[4906]: I0310 00:11:20.676192 
4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 10 00:11:20 crc kubenswrapper[4906]: I0310 00:11:20.720702 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 10 00:11:20 crc kubenswrapper[4906]: I0310 00:11:20.866271 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 10 00:11:20 crc kubenswrapper[4906]: I0310 00:11:20.942659 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 10 00:11:20 crc kubenswrapper[4906]: I0310 00:11:20.966717 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 10 00:11:20 crc kubenswrapper[4906]: I0310 00:11:20.986015 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 10 00:11:21 crc kubenswrapper[4906]: I0310 00:11:21.076495 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 10 00:11:21 crc kubenswrapper[4906]: I0310 00:11:21.109458 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 10 00:11:21 crc kubenswrapper[4906]: I0310 00:11:21.156722 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 10 00:11:21 crc kubenswrapper[4906]: I0310 00:11:21.216750 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 10 00:11:21 crc kubenswrapper[4906]: I0310 00:11:21.411367 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 10 
00:11:21 crc kubenswrapper[4906]: I0310 00:11:21.512766 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 10 00:11:21 crc kubenswrapper[4906]: I0310 00:11:21.520823 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 10 00:11:21 crc kubenswrapper[4906]: I0310 00:11:21.616242 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 10 00:11:21 crc kubenswrapper[4906]: I0310 00:11:21.694003 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 10 00:11:21 crc kubenswrapper[4906]: I0310 00:11:21.705265 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 10 00:11:21 crc kubenswrapper[4906]: I0310 00:11:21.793301 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 10 00:11:21 crc kubenswrapper[4906]: I0310 00:11:21.804342 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 10 00:11:21 crc kubenswrapper[4906]: I0310 00:11:21.903255 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 10 00:11:21 crc kubenswrapper[4906]: I0310 00:11:21.933817 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 10 00:11:21 crc kubenswrapper[4906]: I0310 00:11:21.947562 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 10 00:11:21 crc kubenswrapper[4906]: I0310 00:11:21.959990 4906 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 10 00:11:21 crc kubenswrapper[4906]: I0310 00:11:21.987228 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 10 00:11:22 crc kubenswrapper[4906]: I0310 00:11:22.118347 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 10 00:11:22 crc kubenswrapper[4906]: I0310 00:11:22.179709 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 10 00:11:22 crc kubenswrapper[4906]: I0310 00:11:22.214451 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 10 00:11:22 crc kubenswrapper[4906]: I0310 00:11:22.228743 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 10 00:11:22 crc kubenswrapper[4906]: I0310 00:11:22.288499 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 10 00:11:22 crc kubenswrapper[4906]: I0310 00:11:22.339926 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 10 00:11:22 crc kubenswrapper[4906]: I0310 00:11:22.423421 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 10 00:11:22 crc kubenswrapper[4906]: I0310 00:11:22.501116 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 10 00:11:22 crc kubenswrapper[4906]: I0310 00:11:22.551950 4906 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 10 00:11:22 crc kubenswrapper[4906]: I0310 00:11:22.561851 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 10 00:11:22 crc kubenswrapper[4906]: I0310 00:11:22.573942 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 10 00:11:22 crc kubenswrapper[4906]: I0310 00:11:22.583177 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 10 00:11:22 crc kubenswrapper[4906]: I0310 00:11:22.634287 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 10 00:11:22 crc kubenswrapper[4906]: I0310 00:11:22.634656 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 10 00:11:22 crc kubenswrapper[4906]: I0310 00:11:22.664290 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 10 00:11:22 crc kubenswrapper[4906]: I0310 00:11:22.705835 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 10 00:11:22 crc kubenswrapper[4906]: I0310 00:11:22.746023 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 10 00:11:22 crc kubenswrapper[4906]: I0310 00:11:22.771188 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 10 00:11:22 crc kubenswrapper[4906]: I0310 00:11:22.852936 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 10 00:11:22 crc 
kubenswrapper[4906]: I0310 00:11:22.893754 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 10 00:11:22 crc kubenswrapper[4906]: I0310 00:11:22.898195 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 10 00:11:22 crc kubenswrapper[4906]: I0310 00:11:22.949104 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 10 00:11:22 crc kubenswrapper[4906]: I0310 00:11:22.994319 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 10 00:11:23 crc kubenswrapper[4906]: I0310 00:11:23.046372 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 10 00:11:23 crc kubenswrapper[4906]: I0310 00:11:23.048201 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 10 00:11:23 crc kubenswrapper[4906]: I0310 00:11:23.111587 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 10 00:11:23 crc kubenswrapper[4906]: I0310 00:11:23.115986 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 10 00:11:23 crc kubenswrapper[4906]: I0310 00:11:23.157922 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 10 00:11:23 crc kubenswrapper[4906]: I0310 00:11:23.382890 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 10 00:11:23 crc kubenswrapper[4906]: I0310 00:11:23.512891 4906 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress-canary"/"canary-serving-cert" Mar 10 00:11:23 crc kubenswrapper[4906]: I0310 00:11:23.624129 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 10 00:11:23 crc kubenswrapper[4906]: I0310 00:11:23.640208 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 10 00:11:23 crc kubenswrapper[4906]: I0310 00:11:23.728843 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 10 00:11:23 crc kubenswrapper[4906]: I0310 00:11:23.799063 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 10 00:11:23 crc kubenswrapper[4906]: I0310 00:11:23.906849 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 10 00:11:23 crc kubenswrapper[4906]: I0310 00:11:23.923061 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 10 00:11:24 crc kubenswrapper[4906]: I0310 00:11:24.113255 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 10 00:11:24 crc kubenswrapper[4906]: I0310 00:11:24.225000 4906 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 10 00:11:24 crc kubenswrapper[4906]: I0310 00:11:24.225395 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://fbf5da406f0ffae7503bc7940fdf80767f4e05d6974f27a267f036c83a678df4" gracePeriod=5 Mar 10 00:11:24 crc kubenswrapper[4906]: I0310 00:11:24.273356 
4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 10 00:11:24 crc kubenswrapper[4906]: I0310 00:11:24.370873 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 10 00:11:24 crc kubenswrapper[4906]: I0310 00:11:24.420708 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 10 00:11:24 crc kubenswrapper[4906]: I0310 00:11:24.439831 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 10 00:11:24 crc kubenswrapper[4906]: I0310 00:11:24.532481 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 10 00:11:24 crc kubenswrapper[4906]: I0310 00:11:24.598596 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 10 00:11:24 crc kubenswrapper[4906]: I0310 00:11:24.612499 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 10 00:11:24 crc kubenswrapper[4906]: I0310 00:11:24.680373 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 10 00:11:24 crc kubenswrapper[4906]: I0310 00:11:24.716439 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 10 00:11:24 crc kubenswrapper[4906]: I0310 00:11:24.823136 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 10 00:11:24 crc kubenswrapper[4906]: I0310 00:11:24.932159 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" 
Mar 10 00:11:25 crc kubenswrapper[4906]: I0310 00:11:25.015885 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 10 00:11:25 crc kubenswrapper[4906]: I0310 00:11:25.048204 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 10 00:11:25 crc kubenswrapper[4906]: I0310 00:11:25.142916 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 10 00:11:25 crc kubenswrapper[4906]: I0310 00:11:25.197525 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 10 00:11:25 crc kubenswrapper[4906]: I0310 00:11:25.216928 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 10 00:11:25 crc kubenswrapper[4906]: I0310 00:11:25.255719 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 10 00:11:25 crc kubenswrapper[4906]: I0310 00:11:25.549432 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 10 00:11:25 crc kubenswrapper[4906]: I0310 00:11:25.743672 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 10 00:11:25 crc kubenswrapper[4906]: I0310 00:11:25.799771 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 10 00:11:25 crc kubenswrapper[4906]: I0310 00:11:25.835536 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 10 00:11:26 crc kubenswrapper[4906]: I0310 00:11:26.119477 4906 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 10 00:11:26 crc kubenswrapper[4906]: I0310 00:11:26.177147 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 10 00:11:26 crc kubenswrapper[4906]: I0310 00:11:26.222459 4906 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 10 00:11:26 crc kubenswrapper[4906]: I0310 00:11:26.354365 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 10 00:11:26 crc kubenswrapper[4906]: I0310 00:11:26.409862 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 10 00:11:26 crc kubenswrapper[4906]: I0310 00:11:26.420749 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 10 00:11:26 crc kubenswrapper[4906]: I0310 00:11:26.465795 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 10 00:11:26 crc kubenswrapper[4906]: I0310 00:11:26.495746 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 10 00:11:26 crc kubenswrapper[4906]: I0310 00:11:26.650253 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 10 00:11:26 crc kubenswrapper[4906]: I0310 00:11:26.669711 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 10 00:11:26 crc kubenswrapper[4906]: I0310 00:11:26.880903 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 10 00:11:26 crc kubenswrapper[4906]: I0310 
00:11:26.982718 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 10 00:11:27 crc kubenswrapper[4906]: I0310 00:11:27.013712 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 10 00:11:27 crc kubenswrapper[4906]: I0310 00:11:27.071126 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 10 00:11:27 crc kubenswrapper[4906]: I0310 00:11:27.116510 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 10 00:11:27 crc kubenswrapper[4906]: I0310 00:11:27.239143 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 10 00:11:27 crc kubenswrapper[4906]: I0310 00:11:27.681240 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 10 00:11:29 crc kubenswrapper[4906]: I0310 00:11:29.794190 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 10 00:11:29 crc kubenswrapper[4906]: I0310 00:11:29.794517 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:11:29 crc kubenswrapper[4906]: I0310 00:11:29.821319 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 10 00:11:29 crc kubenswrapper[4906]: I0310 00:11:29.821361 4906 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="fbf5da406f0ffae7503bc7940fdf80767f4e05d6974f27a267f036c83a678df4" exitCode=137 Mar 10 00:11:29 crc kubenswrapper[4906]: I0310 00:11:29.821403 4906 scope.go:117] "RemoveContainer" containerID="fbf5da406f0ffae7503bc7940fdf80767f4e05d6974f27a267f036c83a678df4" Mar 10 00:11:29 crc kubenswrapper[4906]: I0310 00:11:29.821505 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 00:11:29 crc kubenswrapper[4906]: I0310 00:11:29.835279 4906 scope.go:117] "RemoveContainer" containerID="fbf5da406f0ffae7503bc7940fdf80767f4e05d6974f27a267f036c83a678df4" Mar 10 00:11:29 crc kubenswrapper[4906]: E0310 00:11:29.835802 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbf5da406f0ffae7503bc7940fdf80767f4e05d6974f27a267f036c83a678df4\": container with ID starting with fbf5da406f0ffae7503bc7940fdf80767f4e05d6974f27a267f036c83a678df4 not found: ID does not exist" containerID="fbf5da406f0ffae7503bc7940fdf80767f4e05d6974f27a267f036c83a678df4" Mar 10 00:11:29 crc kubenswrapper[4906]: I0310 00:11:29.835914 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbf5da406f0ffae7503bc7940fdf80767f4e05d6974f27a267f036c83a678df4"} err="failed to get container status \"fbf5da406f0ffae7503bc7940fdf80767f4e05d6974f27a267f036c83a678df4\": rpc error: code = NotFound desc = could 
not find container \"fbf5da406f0ffae7503bc7940fdf80767f4e05d6974f27a267f036c83a678df4\": container with ID starting with fbf5da406f0ffae7503bc7940fdf80767f4e05d6974f27a267f036c83a678df4 not found: ID does not exist" Mar 10 00:11:29 crc kubenswrapper[4906]: I0310 00:11:29.912025 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 10 00:11:29 crc kubenswrapper[4906]: I0310 00:11:29.912096 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:11:29 crc kubenswrapper[4906]: I0310 00:11:29.912131 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 10 00:11:29 crc kubenswrapper[4906]: I0310 00:11:29.912173 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 10 00:11:29 crc kubenswrapper[4906]: I0310 00:11:29.912214 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 10 00:11:29 crc 
kubenswrapper[4906]: I0310 00:11:29.912270 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 10 00:11:29 crc kubenswrapper[4906]: I0310 00:11:29.912707 4906 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 10 00:11:29 crc kubenswrapper[4906]: I0310 00:11:29.912780 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:11:29 crc kubenswrapper[4906]: I0310 00:11:29.912776 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:11:29 crc kubenswrapper[4906]: I0310 00:11:29.912837 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:11:29 crc kubenswrapper[4906]: I0310 00:11:29.929018 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:11:30 crc kubenswrapper[4906]: I0310 00:11:30.013711 4906 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 10 00:11:30 crc kubenswrapper[4906]: I0310 00:11:30.013745 4906 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 10 00:11:30 crc kubenswrapper[4906]: I0310 00:11:30.013757 4906 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 10 00:11:30 crc kubenswrapper[4906]: I0310 00:11:30.013768 4906 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 10 00:11:30 crc kubenswrapper[4906]: I0310 00:11:30.585573 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 10 00:11:46 crc kubenswrapper[4906]: I0310 00:11:46.484479 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 10 00:11:48 crc 
kubenswrapper[4906]: I0310 00:11:48.945264 4906 generic.go:334] "Generic (PLEG): container finished" podID="a46863f6-02e5-4d35-8b59-216377e41403" containerID="e69f1b587cd2d25069aa13c3a790a609e50976e1cfe2d22eddc8253bb52babc2" exitCode=0 Mar 10 00:11:48 crc kubenswrapper[4906]: I0310 00:11:48.945380 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vw9rg" event={"ID":"a46863f6-02e5-4d35-8b59-216377e41403","Type":"ContainerDied","Data":"e69f1b587cd2d25069aa13c3a790a609e50976e1cfe2d22eddc8253bb52babc2"} Mar 10 00:11:48 crc kubenswrapper[4906]: I0310 00:11:48.946257 4906 scope.go:117] "RemoveContainer" containerID="e69f1b587cd2d25069aa13c3a790a609e50976e1cfe2d22eddc8253bb52babc2" Mar 10 00:11:49 crc kubenswrapper[4906]: I0310 00:11:49.951973 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vw9rg" event={"ID":"a46863f6-02e5-4d35-8b59-216377e41403","Type":"ContainerStarted","Data":"741ebaa389c06f38a4b78e685b3aa3354c8737a01f690d60aac9ce438ec5127a"} Mar 10 00:11:49 crc kubenswrapper[4906]: I0310 00:11:49.952371 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-vw9rg" Mar 10 00:11:49 crc kubenswrapper[4906]: I0310 00:11:49.954004 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-vw9rg" Mar 10 00:12:00 crc kubenswrapper[4906]: I0310 00:12:00.194033 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551692-zfzmz"] Mar 10 00:12:00 crc kubenswrapper[4906]: E0310 00:12:00.194929 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 10 00:12:00 crc kubenswrapper[4906]: I0310 00:12:00.194950 4906 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 10 00:12:00 crc kubenswrapper[4906]: E0310 00:12:00.194977 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffc7eabb-e49f-497c-912d-f997514651c5" containerName="installer" Mar 10 00:12:00 crc kubenswrapper[4906]: I0310 00:12:00.194991 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffc7eabb-e49f-497c-912d-f997514651c5" containerName="installer" Mar 10 00:12:00 crc kubenswrapper[4906]: I0310 00:12:00.195163 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 10 00:12:00 crc kubenswrapper[4906]: I0310 00:12:00.195190 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffc7eabb-e49f-497c-912d-f997514651c5" containerName="installer" Mar 10 00:12:00 crc kubenswrapper[4906]: I0310 00:12:00.195769 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551692-zfzmz" Mar 10 00:12:00 crc kubenswrapper[4906]: I0310 00:12:00.198459 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 00:12:00 crc kubenswrapper[4906]: I0310 00:12:00.200870 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 00:12:00 crc kubenswrapper[4906]: I0310 00:12:00.201027 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ktdr6" Mar 10 00:12:00 crc kubenswrapper[4906]: I0310 00:12:00.201433 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551692-zfzmz"] Mar 10 00:12:00 crc kubenswrapper[4906]: I0310 00:12:00.357264 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vhxv\" (UniqueName: 
\"kubernetes.io/projected/580d90a8-af39-47f0-81d8-301d64c29a1c-kube-api-access-7vhxv\") pod \"auto-csr-approver-29551692-zfzmz\" (UID: \"580d90a8-af39-47f0-81d8-301d64c29a1c\") " pod="openshift-infra/auto-csr-approver-29551692-zfzmz" Mar 10 00:12:00 crc kubenswrapper[4906]: I0310 00:12:00.458065 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vhxv\" (UniqueName: \"kubernetes.io/projected/580d90a8-af39-47f0-81d8-301d64c29a1c-kube-api-access-7vhxv\") pod \"auto-csr-approver-29551692-zfzmz\" (UID: \"580d90a8-af39-47f0-81d8-301d64c29a1c\") " pod="openshift-infra/auto-csr-approver-29551692-zfzmz" Mar 10 00:12:00 crc kubenswrapper[4906]: I0310 00:12:00.493954 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vhxv\" (UniqueName: \"kubernetes.io/projected/580d90a8-af39-47f0-81d8-301d64c29a1c-kube-api-access-7vhxv\") pod \"auto-csr-approver-29551692-zfzmz\" (UID: \"580d90a8-af39-47f0-81d8-301d64c29a1c\") " pod="openshift-infra/auto-csr-approver-29551692-zfzmz" Mar 10 00:12:00 crc kubenswrapper[4906]: I0310 00:12:00.518962 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551692-zfzmz" Mar 10 00:12:00 crc kubenswrapper[4906]: I0310 00:12:00.980596 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551692-zfzmz"] Mar 10 00:12:00 crc kubenswrapper[4906]: W0310 00:12:00.988911 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod580d90a8_af39_47f0_81d8_301d64c29a1c.slice/crio-af6e7d8a8f398659a48ad21d1e7a00195c88b9b4bc33d30d00f39fa6266af6fd WatchSource:0}: Error finding container af6e7d8a8f398659a48ad21d1e7a00195c88b9b4bc33d30d00f39fa6266af6fd: Status 404 returned error can't find the container with id af6e7d8a8f398659a48ad21d1e7a00195c88b9b4bc33d30d00f39fa6266af6fd Mar 10 00:12:01 crc kubenswrapper[4906]: I0310 00:12:01.017023 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551692-zfzmz" event={"ID":"580d90a8-af39-47f0-81d8-301d64c29a1c","Type":"ContainerStarted","Data":"af6e7d8a8f398659a48ad21d1e7a00195c88b9b4bc33d30d00f39fa6266af6fd"} Mar 10 00:12:03 crc kubenswrapper[4906]: I0310 00:12:03.033547 4906 generic.go:334] "Generic (PLEG): container finished" podID="580d90a8-af39-47f0-81d8-301d64c29a1c" containerID="4968c20beb7d23329043595ef7a842149b3cd4cc66f3f5a5cd7df53b844ba1df" exitCode=0 Mar 10 00:12:03 crc kubenswrapper[4906]: I0310 00:12:03.033686 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551692-zfzmz" event={"ID":"580d90a8-af39-47f0-81d8-301d64c29a1c","Type":"ContainerDied","Data":"4968c20beb7d23329043595ef7a842149b3cd4cc66f3f5a5cd7df53b844ba1df"} Mar 10 00:12:04 crc kubenswrapper[4906]: I0310 00:12:04.471876 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551692-zfzmz" Mar 10 00:12:04 crc kubenswrapper[4906]: I0310 00:12:04.619719 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vhxv\" (UniqueName: \"kubernetes.io/projected/580d90a8-af39-47f0-81d8-301d64c29a1c-kube-api-access-7vhxv\") pod \"580d90a8-af39-47f0-81d8-301d64c29a1c\" (UID: \"580d90a8-af39-47f0-81d8-301d64c29a1c\") " Mar 10 00:12:04 crc kubenswrapper[4906]: I0310 00:12:04.628204 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/580d90a8-af39-47f0-81d8-301d64c29a1c-kube-api-access-7vhxv" (OuterVolumeSpecName: "kube-api-access-7vhxv") pod "580d90a8-af39-47f0-81d8-301d64c29a1c" (UID: "580d90a8-af39-47f0-81d8-301d64c29a1c"). InnerVolumeSpecName "kube-api-access-7vhxv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:12:04 crc kubenswrapper[4906]: I0310 00:12:04.721935 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vhxv\" (UniqueName: \"kubernetes.io/projected/580d90a8-af39-47f0-81d8-301d64c29a1c-kube-api-access-7vhxv\") on node \"crc\" DevicePath \"\"" Mar 10 00:12:05 crc kubenswrapper[4906]: I0310 00:12:05.050459 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551692-zfzmz" event={"ID":"580d90a8-af39-47f0-81d8-301d64c29a1c","Type":"ContainerDied","Data":"af6e7d8a8f398659a48ad21d1e7a00195c88b9b4bc33d30d00f39fa6266af6fd"} Mar 10 00:12:05 crc kubenswrapper[4906]: I0310 00:12:05.050513 4906 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af6e7d8a8f398659a48ad21d1e7a00195c88b9b4bc33d30d00f39fa6266af6fd" Mar 10 00:12:05 crc kubenswrapper[4906]: I0310 00:12:05.050557 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551692-zfzmz" Mar 10 00:12:07 crc kubenswrapper[4906]: I0310 00:12:07.892869 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 10 00:12:44 crc kubenswrapper[4906]: I0310 00:12:44.395516 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nvvhx"] Mar 10 00:12:44 crc kubenswrapper[4906]: I0310 00:12:44.397095 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nvvhx" podUID="dc4d2e8f-54ca-464b-b186-432747b22864" containerName="registry-server" containerID="cri-o://d555d1b48bb1766a9f43cd2a2627345cd232c241457ed488eb1f7b70564cc834" gracePeriod=30 Mar 10 00:12:44 crc kubenswrapper[4906]: I0310 00:12:44.414011 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-t65x9"] Mar 10 00:12:44 crc kubenswrapper[4906]: I0310 00:12:44.414262 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-t65x9" podUID="5331074a-1c86-455a-80e9-6f945936e218" containerName="registry-server" containerID="cri-o://4af373b75c87b9e13e9f5b3ce8a670161ab21c49d5861a970763f95a4475c871" gracePeriod=30 Mar 10 00:12:44 crc kubenswrapper[4906]: I0310 00:12:44.426111 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vw9rg"] Mar 10 00:12:44 crc kubenswrapper[4906]: I0310 00:12:44.426321 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-vw9rg" podUID="a46863f6-02e5-4d35-8b59-216377e41403" containerName="marketplace-operator" containerID="cri-o://741ebaa389c06f38a4b78e685b3aa3354c8737a01f690d60aac9ce438ec5127a" gracePeriod=30 Mar 10 00:12:44 crc kubenswrapper[4906]: I0310 00:12:44.444134 4906 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-whmcg"] Mar 10 00:12:44 crc kubenswrapper[4906]: I0310 00:12:44.444439 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-whmcg" podUID="f1beb2f4-c1c5-488d-8c76-bed30174a0de" containerName="registry-server" containerID="cri-o://c75b6e3143fcef88791c1dfe2ebcf30b7746c0039a8d03cf57565dae7345196a" gracePeriod=30 Mar 10 00:12:44 crc kubenswrapper[4906]: I0310 00:12:44.459905 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bdv5q"] Mar 10 00:12:44 crc kubenswrapper[4906]: E0310 00:12:44.460297 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="580d90a8-af39-47f0-81d8-301d64c29a1c" containerName="oc" Mar 10 00:12:44 crc kubenswrapper[4906]: I0310 00:12:44.460325 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="580d90a8-af39-47f0-81d8-301d64c29a1c" containerName="oc" Mar 10 00:12:44 crc kubenswrapper[4906]: I0310 00:12:44.460471 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="580d90a8-af39-47f0-81d8-301d64c29a1c" containerName="oc" Mar 10 00:12:44 crc kubenswrapper[4906]: I0310 00:12:44.461023 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bdv5q" Mar 10 00:12:44 crc kubenswrapper[4906]: I0310 00:12:44.470613 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nwlr8"] Mar 10 00:12:44 crc kubenswrapper[4906]: I0310 00:12:44.470870 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nwlr8" podUID="68c8b80a-0af0-46cb-8a57-a353444de9dc" containerName="registry-server" containerID="cri-o://22f829c1c6b4ebd7aa3a25c52f981aa5da2781bd3283551c47f8df2849453d92" gracePeriod=30 Mar 10 00:12:44 crc kubenswrapper[4906]: I0310 00:12:44.471993 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bdv5q"] Mar 10 00:12:44 crc kubenswrapper[4906]: I0310 00:12:44.640624 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2bafcdb0-094e-4426-96b0-c23d59d49da2-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bdv5q\" (UID: \"2bafcdb0-094e-4426-96b0-c23d59d49da2\") " pod="openshift-marketplace/marketplace-operator-79b997595-bdv5q" Mar 10 00:12:44 crc kubenswrapper[4906]: I0310 00:12:44.640733 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2bafcdb0-094e-4426-96b0-c23d59d49da2-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bdv5q\" (UID: \"2bafcdb0-094e-4426-96b0-c23d59d49da2\") " pod="openshift-marketplace/marketplace-operator-79b997595-bdv5q" Mar 10 00:12:44 crc kubenswrapper[4906]: I0310 00:12:44.640796 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmh5d\" (UniqueName: 
\"kubernetes.io/projected/2bafcdb0-094e-4426-96b0-c23d59d49da2-kube-api-access-hmh5d\") pod \"marketplace-operator-79b997595-bdv5q\" (UID: \"2bafcdb0-094e-4426-96b0-c23d59d49da2\") " pod="openshift-marketplace/marketplace-operator-79b997595-bdv5q" Mar 10 00:12:44 crc kubenswrapper[4906]: I0310 00:12:44.742179 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmh5d\" (UniqueName: \"kubernetes.io/projected/2bafcdb0-094e-4426-96b0-c23d59d49da2-kube-api-access-hmh5d\") pod \"marketplace-operator-79b997595-bdv5q\" (UID: \"2bafcdb0-094e-4426-96b0-c23d59d49da2\") " pod="openshift-marketplace/marketplace-operator-79b997595-bdv5q" Mar 10 00:12:44 crc kubenswrapper[4906]: I0310 00:12:44.742245 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2bafcdb0-094e-4426-96b0-c23d59d49da2-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bdv5q\" (UID: \"2bafcdb0-094e-4426-96b0-c23d59d49da2\") " pod="openshift-marketplace/marketplace-operator-79b997595-bdv5q" Mar 10 00:12:44 crc kubenswrapper[4906]: I0310 00:12:44.742297 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2bafcdb0-094e-4426-96b0-c23d59d49da2-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bdv5q\" (UID: \"2bafcdb0-094e-4426-96b0-c23d59d49da2\") " pod="openshift-marketplace/marketplace-operator-79b997595-bdv5q" Mar 10 00:12:44 crc kubenswrapper[4906]: I0310 00:12:44.747927 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2bafcdb0-094e-4426-96b0-c23d59d49da2-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bdv5q\" (UID: \"2bafcdb0-094e-4426-96b0-c23d59d49da2\") " pod="openshift-marketplace/marketplace-operator-79b997595-bdv5q" Mar 10 00:12:44 crc 
kubenswrapper[4906]: I0310 00:12:44.777928 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2bafcdb0-094e-4426-96b0-c23d59d49da2-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bdv5q\" (UID: \"2bafcdb0-094e-4426-96b0-c23d59d49da2\") " pod="openshift-marketplace/marketplace-operator-79b997595-bdv5q" Mar 10 00:12:44 crc kubenswrapper[4906]: I0310 00:12:44.789590 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmh5d\" (UniqueName: \"kubernetes.io/projected/2bafcdb0-094e-4426-96b0-c23d59d49da2-kube-api-access-hmh5d\") pod \"marketplace-operator-79b997595-bdv5q\" (UID: \"2bafcdb0-094e-4426-96b0-c23d59d49da2\") " pod="openshift-marketplace/marketplace-operator-79b997595-bdv5q" Mar 10 00:12:44 crc kubenswrapper[4906]: I0310 00:12:44.920816 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bdv5q" Mar 10 00:12:44 crc kubenswrapper[4906]: I0310 00:12:44.931041 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-whmcg" Mar 10 00:12:44 crc kubenswrapper[4906]: I0310 00:12:44.933870 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t65x9" Mar 10 00:12:44 crc kubenswrapper[4906]: I0310 00:12:44.940973 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nvvhx" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.049186 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfk87\" (UniqueName: \"kubernetes.io/projected/5331074a-1c86-455a-80e9-6f945936e218-kube-api-access-gfk87\") pod \"5331074a-1c86-455a-80e9-6f945936e218\" (UID: \"5331074a-1c86-455a-80e9-6f945936e218\") " Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.049230 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc4d2e8f-54ca-464b-b186-432747b22864-catalog-content\") pod \"dc4d2e8f-54ca-464b-b186-432747b22864\" (UID: \"dc4d2e8f-54ca-464b-b186-432747b22864\") " Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.049272 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc4d2e8f-54ca-464b-b186-432747b22864-utilities\") pod \"dc4d2e8f-54ca-464b-b186-432747b22864\" (UID: \"dc4d2e8f-54ca-464b-b186-432747b22864\") " Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.049306 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1beb2f4-c1c5-488d-8c76-bed30174a0de-utilities\") pod \"f1beb2f4-c1c5-488d-8c76-bed30174a0de\" (UID: \"f1beb2f4-c1c5-488d-8c76-bed30174a0de\") " Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.049340 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1beb2f4-c1c5-488d-8c76-bed30174a0de-catalog-content\") pod \"f1beb2f4-c1c5-488d-8c76-bed30174a0de\" (UID: \"f1beb2f4-c1c5-488d-8c76-bed30174a0de\") " Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.049396 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5331074a-1c86-455a-80e9-6f945936e218-catalog-content\") pod \"5331074a-1c86-455a-80e9-6f945936e218\" (UID: \"5331074a-1c86-455a-80e9-6f945936e218\") " Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.049502 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvbd9\" (UniqueName: \"kubernetes.io/projected/f1beb2f4-c1c5-488d-8c76-bed30174a0de-kube-api-access-zvbd9\") pod \"f1beb2f4-c1c5-488d-8c76-bed30174a0de\" (UID: \"f1beb2f4-c1c5-488d-8c76-bed30174a0de\") " Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.049531 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4f97\" (UniqueName: \"kubernetes.io/projected/dc4d2e8f-54ca-464b-b186-432747b22864-kube-api-access-j4f97\") pod \"dc4d2e8f-54ca-464b-b186-432747b22864\" (UID: \"dc4d2e8f-54ca-464b-b186-432747b22864\") " Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.049573 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5331074a-1c86-455a-80e9-6f945936e218-utilities\") pod \"5331074a-1c86-455a-80e9-6f945936e218\" (UID: \"5331074a-1c86-455a-80e9-6f945936e218\") " Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.050396 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc4d2e8f-54ca-464b-b186-432747b22864-utilities" (OuterVolumeSpecName: "utilities") pod "dc4d2e8f-54ca-464b-b186-432747b22864" (UID: "dc4d2e8f-54ca-464b-b186-432747b22864"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.051522 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5331074a-1c86-455a-80e9-6f945936e218-utilities" (OuterVolumeSpecName: "utilities") pod "5331074a-1c86-455a-80e9-6f945936e218" (UID: "5331074a-1c86-455a-80e9-6f945936e218"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.051597 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1beb2f4-c1c5-488d-8c76-bed30174a0de-utilities" (OuterVolumeSpecName: "utilities") pod "f1beb2f4-c1c5-488d-8c76-bed30174a0de" (UID: "f1beb2f4-c1c5-488d-8c76-bed30174a0de"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.054672 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc4d2e8f-54ca-464b-b186-432747b22864-kube-api-access-j4f97" (OuterVolumeSpecName: "kube-api-access-j4f97") pod "dc4d2e8f-54ca-464b-b186-432747b22864" (UID: "dc4d2e8f-54ca-464b-b186-432747b22864"). InnerVolumeSpecName "kube-api-access-j4f97". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.055847 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1beb2f4-c1c5-488d-8c76-bed30174a0de-kube-api-access-zvbd9" (OuterVolumeSpecName: "kube-api-access-zvbd9") pod "f1beb2f4-c1c5-488d-8c76-bed30174a0de" (UID: "f1beb2f4-c1c5-488d-8c76-bed30174a0de"). InnerVolumeSpecName "kube-api-access-zvbd9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.060906 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5331074a-1c86-455a-80e9-6f945936e218-kube-api-access-gfk87" (OuterVolumeSpecName: "kube-api-access-gfk87") pod "5331074a-1c86-455a-80e9-6f945936e218" (UID: "5331074a-1c86-455a-80e9-6f945936e218"). InnerVolumeSpecName "kube-api-access-gfk87". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.132265 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc4d2e8f-54ca-464b-b186-432747b22864-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dc4d2e8f-54ca-464b-b186-432747b22864" (UID: "dc4d2e8f-54ca-464b-b186-432747b22864"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.132296 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5331074a-1c86-455a-80e9-6f945936e218-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5331074a-1c86-455a-80e9-6f945936e218" (UID: "5331074a-1c86-455a-80e9-6f945936e218"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.133807 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1beb2f4-c1c5-488d-8c76-bed30174a0de-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f1beb2f4-c1c5-488d-8c76-bed30174a0de" (UID: "f1beb2f4-c1c5-488d-8c76-bed30174a0de"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.151391 4906 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1beb2f4-c1c5-488d-8c76-bed30174a0de-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.151440 4906 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1beb2f4-c1c5-488d-8c76-bed30174a0de-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.151472 4906 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5331074a-1c86-455a-80e9-6f945936e218-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.151485 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvbd9\" (UniqueName: \"kubernetes.io/projected/f1beb2f4-c1c5-488d-8c76-bed30174a0de-kube-api-access-zvbd9\") on node \"crc\" DevicePath \"\"" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.151497 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4f97\" (UniqueName: \"kubernetes.io/projected/dc4d2e8f-54ca-464b-b186-432747b22864-kube-api-access-j4f97\") on node \"crc\" DevicePath \"\"" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.151505 4906 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5331074a-1c86-455a-80e9-6f945936e218-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.151517 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfk87\" (UniqueName: \"kubernetes.io/projected/5331074a-1c86-455a-80e9-6f945936e218-kube-api-access-gfk87\") on node \"crc\" DevicePath \"\"" Mar 10 00:12:45 crc kubenswrapper[4906]: 
I0310 00:12:45.151527 4906 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc4d2e8f-54ca-464b-b186-432747b22864-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.151554 4906 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc4d2e8f-54ca-464b-b186-432747b22864-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.282228 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vw9rg" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.316096 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nwlr8" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.324989 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bdv5q"] Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.353089 4906 generic.go:334] "Generic (PLEG): container finished" podID="dc4d2e8f-54ca-464b-b186-432747b22864" containerID="d555d1b48bb1766a9f43cd2a2627345cd232c241457ed488eb1f7b70564cc834" exitCode=0 Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.353135 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nvvhx" event={"ID":"dc4d2e8f-54ca-464b-b186-432747b22864","Type":"ContainerDied","Data":"d555d1b48bb1766a9f43cd2a2627345cd232c241457ed488eb1f7b70564cc834"} Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.353189 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nvvhx" event={"ID":"dc4d2e8f-54ca-464b-b186-432747b22864","Type":"ContainerDied","Data":"90e818fe2518b16ed23dc1742efc9c1c19ed36092fcfa2efd1dd7db2ca51273c"} Mar 10 00:12:45 crc 
kubenswrapper[4906]: I0310 00:12:45.353212 4906 scope.go:117] "RemoveContainer" containerID="d555d1b48bb1766a9f43cd2a2627345cd232c241457ed488eb1f7b70564cc834" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.353277 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nvvhx" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.354970 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68c8b80a-0af0-46cb-8a57-a353444de9dc-utilities\") pod \"68c8b80a-0af0-46cb-8a57-a353444de9dc\" (UID: \"68c8b80a-0af0-46cb-8a57-a353444de9dc\") " Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.355002 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68c8b80a-0af0-46cb-8a57-a353444de9dc-catalog-content\") pod \"68c8b80a-0af0-46cb-8a57-a353444de9dc\" (UID: \"68c8b80a-0af0-46cb-8a57-a353444de9dc\") " Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.355036 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7jhp\" (UniqueName: \"kubernetes.io/projected/a46863f6-02e5-4d35-8b59-216377e41403-kube-api-access-p7jhp\") pod \"a46863f6-02e5-4d35-8b59-216377e41403\" (UID: \"a46863f6-02e5-4d35-8b59-216377e41403\") " Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.355080 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x59lh\" (UniqueName: \"kubernetes.io/projected/68c8b80a-0af0-46cb-8a57-a353444de9dc-kube-api-access-x59lh\") pod \"68c8b80a-0af0-46cb-8a57-a353444de9dc\" (UID: \"68c8b80a-0af0-46cb-8a57-a353444de9dc\") " Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.355124 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/a46863f6-02e5-4d35-8b59-216377e41403-marketplace-trusted-ca\") pod \"a46863f6-02e5-4d35-8b59-216377e41403\" (UID: \"a46863f6-02e5-4d35-8b59-216377e41403\") " Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.355144 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a46863f6-02e5-4d35-8b59-216377e41403-marketplace-operator-metrics\") pod \"a46863f6-02e5-4d35-8b59-216377e41403\" (UID: \"a46863f6-02e5-4d35-8b59-216377e41403\") " Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.357117 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68c8b80a-0af0-46cb-8a57-a353444de9dc-utilities" (OuterVolumeSpecName: "utilities") pod "68c8b80a-0af0-46cb-8a57-a353444de9dc" (UID: "68c8b80a-0af0-46cb-8a57-a353444de9dc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.357735 4906 generic.go:334] "Generic (PLEG): container finished" podID="a46863f6-02e5-4d35-8b59-216377e41403" containerID="741ebaa389c06f38a4b78e685b3aa3354c8737a01f690d60aac9ce438ec5127a" exitCode=0 Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.357808 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vw9rg" event={"ID":"a46863f6-02e5-4d35-8b59-216377e41403","Type":"ContainerDied","Data":"741ebaa389c06f38a4b78e685b3aa3354c8737a01f690d60aac9ce438ec5127a"} Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.357833 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vw9rg" event={"ID":"a46863f6-02e5-4d35-8b59-216377e41403","Type":"ContainerDied","Data":"2a1ca1e77aa51ff73f567a9ff25f601106a3f2e3f7ca949b5e359c1ad6a71ba6"} Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.357742 4906 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a46863f6-02e5-4d35-8b59-216377e41403-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "a46863f6-02e5-4d35-8b59-216377e41403" (UID: "a46863f6-02e5-4d35-8b59-216377e41403"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.357936 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vw9rg" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.359454 4906 generic.go:334] "Generic (PLEG): container finished" podID="68c8b80a-0af0-46cb-8a57-a353444de9dc" containerID="22f829c1c6b4ebd7aa3a25c52f981aa5da2781bd3283551c47f8df2849453d92" exitCode=0 Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.359493 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nwlr8" event={"ID":"68c8b80a-0af0-46cb-8a57-a353444de9dc","Type":"ContainerDied","Data":"22f829c1c6b4ebd7aa3a25c52f981aa5da2781bd3283551c47f8df2849453d92"} Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.359510 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nwlr8" event={"ID":"68c8b80a-0af0-46cb-8a57-a353444de9dc","Type":"ContainerDied","Data":"d0c6d459d953c52e76f1e79e20d47afe31f745624dd8022ad25146bd1500b707"} Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.359560 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nwlr8" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.379930 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a46863f6-02e5-4d35-8b59-216377e41403-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "a46863f6-02e5-4d35-8b59-216377e41403" (UID: "a46863f6-02e5-4d35-8b59-216377e41403"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.380099 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a46863f6-02e5-4d35-8b59-216377e41403-kube-api-access-p7jhp" (OuterVolumeSpecName: "kube-api-access-p7jhp") pod "a46863f6-02e5-4d35-8b59-216377e41403" (UID: "a46863f6-02e5-4d35-8b59-216377e41403"). InnerVolumeSpecName "kube-api-access-p7jhp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.381508 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68c8b80a-0af0-46cb-8a57-a353444de9dc-kube-api-access-x59lh" (OuterVolumeSpecName: "kube-api-access-x59lh") pod "68c8b80a-0af0-46cb-8a57-a353444de9dc" (UID: "68c8b80a-0af0-46cb-8a57-a353444de9dc"). InnerVolumeSpecName "kube-api-access-x59lh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.389743 4906 generic.go:334] "Generic (PLEG): container finished" podID="f1beb2f4-c1c5-488d-8c76-bed30174a0de" containerID="c75b6e3143fcef88791c1dfe2ebcf30b7746c0039a8d03cf57565dae7345196a" exitCode=0 Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.389831 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-whmcg" event={"ID":"f1beb2f4-c1c5-488d-8c76-bed30174a0de","Type":"ContainerDied","Data":"c75b6e3143fcef88791c1dfe2ebcf30b7746c0039a8d03cf57565dae7345196a"} Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.389869 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-whmcg" event={"ID":"f1beb2f4-c1c5-488d-8c76-bed30174a0de","Type":"ContainerDied","Data":"ffe09a6914c2f34259169dbc5c27f70c5da0f786e780406e5c05938c5a78da4e"} Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.390186 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-whmcg" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.398019 4906 scope.go:117] "RemoveContainer" containerID="0a3bec7e1c35fcaae5318f83d776257e5b5d9ce3cb7961baf70b55652f175381" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.399079 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-t65x9" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.399104 4906 generic.go:334] "Generic (PLEG): container finished" podID="5331074a-1c86-455a-80e9-6f945936e218" containerID="4af373b75c87b9e13e9f5b3ce8a670161ab21c49d5861a970763f95a4475c871" exitCode=0 Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.399234 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t65x9" event={"ID":"5331074a-1c86-455a-80e9-6f945936e218","Type":"ContainerDied","Data":"4af373b75c87b9e13e9f5b3ce8a670161ab21c49d5861a970763f95a4475c871"} Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.399277 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t65x9" event={"ID":"5331074a-1c86-455a-80e9-6f945936e218","Type":"ContainerDied","Data":"44f72a4b931b74aab5af064c0720b88108e9acdd89a9d3ca2815356148c420ef"} Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.407369 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nvvhx"] Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.411992 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nvvhx"] Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.419044 4906 scope.go:117] "RemoveContainer" containerID="eb3bfe73e4d142b168b2c628fb012ce231de3bc959b2e3e6f29d940caf292556" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.446863 4906 scope.go:117] "RemoveContainer" containerID="d555d1b48bb1766a9f43cd2a2627345cd232c241457ed488eb1f7b70564cc834" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.447277 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-whmcg"] Mar 10 00:12:45 crc kubenswrapper[4906]: E0310 00:12:45.449595 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"d555d1b48bb1766a9f43cd2a2627345cd232c241457ed488eb1f7b70564cc834\": container with ID starting with d555d1b48bb1766a9f43cd2a2627345cd232c241457ed488eb1f7b70564cc834 not found: ID does not exist" containerID="d555d1b48bb1766a9f43cd2a2627345cd232c241457ed488eb1f7b70564cc834" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.449701 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d555d1b48bb1766a9f43cd2a2627345cd232c241457ed488eb1f7b70564cc834"} err="failed to get container status \"d555d1b48bb1766a9f43cd2a2627345cd232c241457ed488eb1f7b70564cc834\": rpc error: code = NotFound desc = could not find container \"d555d1b48bb1766a9f43cd2a2627345cd232c241457ed488eb1f7b70564cc834\": container with ID starting with d555d1b48bb1766a9f43cd2a2627345cd232c241457ed488eb1f7b70564cc834 not found: ID does not exist" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.449764 4906 scope.go:117] "RemoveContainer" containerID="0a3bec7e1c35fcaae5318f83d776257e5b5d9ce3cb7961baf70b55652f175381" Mar 10 00:12:45 crc kubenswrapper[4906]: E0310 00:12:45.450442 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a3bec7e1c35fcaae5318f83d776257e5b5d9ce3cb7961baf70b55652f175381\": container with ID starting with 0a3bec7e1c35fcaae5318f83d776257e5b5d9ce3cb7961baf70b55652f175381 not found: ID does not exist" containerID="0a3bec7e1c35fcaae5318f83d776257e5b5d9ce3cb7961baf70b55652f175381" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.450476 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a3bec7e1c35fcaae5318f83d776257e5b5d9ce3cb7961baf70b55652f175381"} err="failed to get container status \"0a3bec7e1c35fcaae5318f83d776257e5b5d9ce3cb7961baf70b55652f175381\": rpc error: code = NotFound desc = could not find container 
\"0a3bec7e1c35fcaae5318f83d776257e5b5d9ce3cb7961baf70b55652f175381\": container with ID starting with 0a3bec7e1c35fcaae5318f83d776257e5b5d9ce3cb7961baf70b55652f175381 not found: ID does not exist" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.450709 4906 scope.go:117] "RemoveContainer" containerID="eb3bfe73e4d142b168b2c628fb012ce231de3bc959b2e3e6f29d940caf292556" Mar 10 00:12:45 crc kubenswrapper[4906]: E0310 00:12:45.451053 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb3bfe73e4d142b168b2c628fb012ce231de3bc959b2e3e6f29d940caf292556\": container with ID starting with eb3bfe73e4d142b168b2c628fb012ce231de3bc959b2e3e6f29d940caf292556 not found: ID does not exist" containerID="eb3bfe73e4d142b168b2c628fb012ce231de3bc959b2e3e6f29d940caf292556" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.451088 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb3bfe73e4d142b168b2c628fb012ce231de3bc959b2e3e6f29d940caf292556"} err="failed to get container status \"eb3bfe73e4d142b168b2c628fb012ce231de3bc959b2e3e6f29d940caf292556\": rpc error: code = NotFound desc = could not find container \"eb3bfe73e4d142b168b2c628fb012ce231de3bc959b2e3e6f29d940caf292556\": container with ID starting with eb3bfe73e4d142b168b2c628fb012ce231de3bc959b2e3e6f29d940caf292556 not found: ID does not exist" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.451113 4906 scope.go:117] "RemoveContainer" containerID="741ebaa389c06f38a4b78e685b3aa3354c8737a01f690d60aac9ce438ec5127a" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.452686 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-whmcg"] Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.456808 4906 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/a46863f6-02e5-4d35-8b59-216377e41403-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.456839 4906 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a46863f6-02e5-4d35-8b59-216377e41403-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.456855 4906 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68c8b80a-0af0-46cb-8a57-a353444de9dc-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.456870 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7jhp\" (UniqueName: \"kubernetes.io/projected/a46863f6-02e5-4d35-8b59-216377e41403-kube-api-access-p7jhp\") on node \"crc\" DevicePath \"\"" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.456889 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x59lh\" (UniqueName: \"kubernetes.io/projected/68c8b80a-0af0-46cb-8a57-a353444de9dc-kube-api-access-x59lh\") on node \"crc\" DevicePath \"\"" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.461654 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-t65x9"] Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.470343 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-t65x9"] Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.470385 4906 scope.go:117] "RemoveContainer" containerID="e69f1b587cd2d25069aa13c3a790a609e50976e1cfe2d22eddc8253bb52babc2" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.486428 4906 scope.go:117] "RemoveContainer" containerID="741ebaa389c06f38a4b78e685b3aa3354c8737a01f690d60aac9ce438ec5127a" Mar 10 00:12:45 crc kubenswrapper[4906]: E0310 
00:12:45.486914 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"741ebaa389c06f38a4b78e685b3aa3354c8737a01f690d60aac9ce438ec5127a\": container with ID starting with 741ebaa389c06f38a4b78e685b3aa3354c8737a01f690d60aac9ce438ec5127a not found: ID does not exist" containerID="741ebaa389c06f38a4b78e685b3aa3354c8737a01f690d60aac9ce438ec5127a" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.486960 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"741ebaa389c06f38a4b78e685b3aa3354c8737a01f690d60aac9ce438ec5127a"} err="failed to get container status \"741ebaa389c06f38a4b78e685b3aa3354c8737a01f690d60aac9ce438ec5127a\": rpc error: code = NotFound desc = could not find container \"741ebaa389c06f38a4b78e685b3aa3354c8737a01f690d60aac9ce438ec5127a\": container with ID starting with 741ebaa389c06f38a4b78e685b3aa3354c8737a01f690d60aac9ce438ec5127a not found: ID does not exist" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.486994 4906 scope.go:117] "RemoveContainer" containerID="e69f1b587cd2d25069aa13c3a790a609e50976e1cfe2d22eddc8253bb52babc2" Mar 10 00:12:45 crc kubenswrapper[4906]: E0310 00:12:45.487335 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e69f1b587cd2d25069aa13c3a790a609e50976e1cfe2d22eddc8253bb52babc2\": container with ID starting with e69f1b587cd2d25069aa13c3a790a609e50976e1cfe2d22eddc8253bb52babc2 not found: ID does not exist" containerID="e69f1b587cd2d25069aa13c3a790a609e50976e1cfe2d22eddc8253bb52babc2" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.487368 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e69f1b587cd2d25069aa13c3a790a609e50976e1cfe2d22eddc8253bb52babc2"} err="failed to get container status \"e69f1b587cd2d25069aa13c3a790a609e50976e1cfe2d22eddc8253bb52babc2\": rpc 
error: code = NotFound desc = could not find container \"e69f1b587cd2d25069aa13c3a790a609e50976e1cfe2d22eddc8253bb52babc2\": container with ID starting with e69f1b587cd2d25069aa13c3a790a609e50976e1cfe2d22eddc8253bb52babc2 not found: ID does not exist" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.487397 4906 scope.go:117] "RemoveContainer" containerID="22f829c1c6b4ebd7aa3a25c52f981aa5da2781bd3283551c47f8df2849453d92" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.500826 4906 scope.go:117] "RemoveContainer" containerID="bdec8f9b1877ddde359a4f10d14b9aad9979f8d6d4c0d1b1a9d1a0282ca458d1" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.519696 4906 scope.go:117] "RemoveContainer" containerID="4c75d0ad3e643818b0755e7d1c3f1580b0dcb16b9627e9b8cf07dd1af7d05a29" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.535333 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68c8b80a-0af0-46cb-8a57-a353444de9dc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "68c8b80a-0af0-46cb-8a57-a353444de9dc" (UID: "68c8b80a-0af0-46cb-8a57-a353444de9dc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.558425 4906 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68c8b80a-0af0-46cb-8a57-a353444de9dc-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.566293 4906 scope.go:117] "RemoveContainer" containerID="22f829c1c6b4ebd7aa3a25c52f981aa5da2781bd3283551c47f8df2849453d92" Mar 10 00:12:45 crc kubenswrapper[4906]: E0310 00:12:45.566834 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22f829c1c6b4ebd7aa3a25c52f981aa5da2781bd3283551c47f8df2849453d92\": container with ID starting with 22f829c1c6b4ebd7aa3a25c52f981aa5da2781bd3283551c47f8df2849453d92 not found: ID does not exist" containerID="22f829c1c6b4ebd7aa3a25c52f981aa5da2781bd3283551c47f8df2849453d92" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.566868 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22f829c1c6b4ebd7aa3a25c52f981aa5da2781bd3283551c47f8df2849453d92"} err="failed to get container status \"22f829c1c6b4ebd7aa3a25c52f981aa5da2781bd3283551c47f8df2849453d92\": rpc error: code = NotFound desc = could not find container \"22f829c1c6b4ebd7aa3a25c52f981aa5da2781bd3283551c47f8df2849453d92\": container with ID starting with 22f829c1c6b4ebd7aa3a25c52f981aa5da2781bd3283551c47f8df2849453d92 not found: ID does not exist" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.566889 4906 scope.go:117] "RemoveContainer" containerID="bdec8f9b1877ddde359a4f10d14b9aad9979f8d6d4c0d1b1a9d1a0282ca458d1" Mar 10 00:12:45 crc kubenswrapper[4906]: E0310 00:12:45.567377 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"bdec8f9b1877ddde359a4f10d14b9aad9979f8d6d4c0d1b1a9d1a0282ca458d1\": container with ID starting with bdec8f9b1877ddde359a4f10d14b9aad9979f8d6d4c0d1b1a9d1a0282ca458d1 not found: ID does not exist" containerID="bdec8f9b1877ddde359a4f10d14b9aad9979f8d6d4c0d1b1a9d1a0282ca458d1" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.567422 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdec8f9b1877ddde359a4f10d14b9aad9979f8d6d4c0d1b1a9d1a0282ca458d1"} err="failed to get container status \"bdec8f9b1877ddde359a4f10d14b9aad9979f8d6d4c0d1b1a9d1a0282ca458d1\": rpc error: code = NotFound desc = could not find container \"bdec8f9b1877ddde359a4f10d14b9aad9979f8d6d4c0d1b1a9d1a0282ca458d1\": container with ID starting with bdec8f9b1877ddde359a4f10d14b9aad9979f8d6d4c0d1b1a9d1a0282ca458d1 not found: ID does not exist" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.567451 4906 scope.go:117] "RemoveContainer" containerID="4c75d0ad3e643818b0755e7d1c3f1580b0dcb16b9627e9b8cf07dd1af7d05a29" Mar 10 00:12:45 crc kubenswrapper[4906]: E0310 00:12:45.567936 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c75d0ad3e643818b0755e7d1c3f1580b0dcb16b9627e9b8cf07dd1af7d05a29\": container with ID starting with 4c75d0ad3e643818b0755e7d1c3f1580b0dcb16b9627e9b8cf07dd1af7d05a29 not found: ID does not exist" containerID="4c75d0ad3e643818b0755e7d1c3f1580b0dcb16b9627e9b8cf07dd1af7d05a29" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.567961 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c75d0ad3e643818b0755e7d1c3f1580b0dcb16b9627e9b8cf07dd1af7d05a29"} err="failed to get container status \"4c75d0ad3e643818b0755e7d1c3f1580b0dcb16b9627e9b8cf07dd1af7d05a29\": rpc error: code = NotFound desc = could not find container \"4c75d0ad3e643818b0755e7d1c3f1580b0dcb16b9627e9b8cf07dd1af7d05a29\": container with ID 
starting with 4c75d0ad3e643818b0755e7d1c3f1580b0dcb16b9627e9b8cf07dd1af7d05a29 not found: ID does not exist" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.567976 4906 scope.go:117] "RemoveContainer" containerID="c75b6e3143fcef88791c1dfe2ebcf30b7746c0039a8d03cf57565dae7345196a" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.589260 4906 scope.go:117] "RemoveContainer" containerID="711228789c558ea0d1afc95a96bf9def92762e27d2713a45164c7f89b91178b2" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.606340 4906 scope.go:117] "RemoveContainer" containerID="459c36aaee677af4ae9fca1a1132c151871bf6706b661310394ba474cb5bb9af" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.626255 4906 scope.go:117] "RemoveContainer" containerID="c75b6e3143fcef88791c1dfe2ebcf30b7746c0039a8d03cf57565dae7345196a" Mar 10 00:12:45 crc kubenswrapper[4906]: E0310 00:12:45.626811 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c75b6e3143fcef88791c1dfe2ebcf30b7746c0039a8d03cf57565dae7345196a\": container with ID starting with c75b6e3143fcef88791c1dfe2ebcf30b7746c0039a8d03cf57565dae7345196a not found: ID does not exist" containerID="c75b6e3143fcef88791c1dfe2ebcf30b7746c0039a8d03cf57565dae7345196a" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.626858 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c75b6e3143fcef88791c1dfe2ebcf30b7746c0039a8d03cf57565dae7345196a"} err="failed to get container status \"c75b6e3143fcef88791c1dfe2ebcf30b7746c0039a8d03cf57565dae7345196a\": rpc error: code = NotFound desc = could not find container \"c75b6e3143fcef88791c1dfe2ebcf30b7746c0039a8d03cf57565dae7345196a\": container with ID starting with c75b6e3143fcef88791c1dfe2ebcf30b7746c0039a8d03cf57565dae7345196a not found: ID does not exist" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.626889 4906 scope.go:117] "RemoveContainer" 
containerID="711228789c558ea0d1afc95a96bf9def92762e27d2713a45164c7f89b91178b2" Mar 10 00:12:45 crc kubenswrapper[4906]: E0310 00:12:45.627318 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"711228789c558ea0d1afc95a96bf9def92762e27d2713a45164c7f89b91178b2\": container with ID starting with 711228789c558ea0d1afc95a96bf9def92762e27d2713a45164c7f89b91178b2 not found: ID does not exist" containerID="711228789c558ea0d1afc95a96bf9def92762e27d2713a45164c7f89b91178b2" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.627372 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"711228789c558ea0d1afc95a96bf9def92762e27d2713a45164c7f89b91178b2"} err="failed to get container status \"711228789c558ea0d1afc95a96bf9def92762e27d2713a45164c7f89b91178b2\": rpc error: code = NotFound desc = could not find container \"711228789c558ea0d1afc95a96bf9def92762e27d2713a45164c7f89b91178b2\": container with ID starting with 711228789c558ea0d1afc95a96bf9def92762e27d2713a45164c7f89b91178b2 not found: ID does not exist" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.627403 4906 scope.go:117] "RemoveContainer" containerID="459c36aaee677af4ae9fca1a1132c151871bf6706b661310394ba474cb5bb9af" Mar 10 00:12:45 crc kubenswrapper[4906]: E0310 00:12:45.627799 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"459c36aaee677af4ae9fca1a1132c151871bf6706b661310394ba474cb5bb9af\": container with ID starting with 459c36aaee677af4ae9fca1a1132c151871bf6706b661310394ba474cb5bb9af not found: ID does not exist" containerID="459c36aaee677af4ae9fca1a1132c151871bf6706b661310394ba474cb5bb9af" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.627827 4906 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"459c36aaee677af4ae9fca1a1132c151871bf6706b661310394ba474cb5bb9af"} err="failed to get container status \"459c36aaee677af4ae9fca1a1132c151871bf6706b661310394ba474cb5bb9af\": rpc error: code = NotFound desc = could not find container \"459c36aaee677af4ae9fca1a1132c151871bf6706b661310394ba474cb5bb9af\": container with ID starting with 459c36aaee677af4ae9fca1a1132c151871bf6706b661310394ba474cb5bb9af not found: ID does not exist" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.627843 4906 scope.go:117] "RemoveContainer" containerID="4af373b75c87b9e13e9f5b3ce8a670161ab21c49d5861a970763f95a4475c871" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.641908 4906 scope.go:117] "RemoveContainer" containerID="db247e3434e7832999a110107e7ad0e11c7fa548c68d84f4ebdd7bdc54438a1d" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.658399 4906 scope.go:117] "RemoveContainer" containerID="e57b6c2d2f8a97e0b7c0aa7eba8757eb9df4cb672e4128d3e93135e5f36c7d36" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.689856 4906 scope.go:117] "RemoveContainer" containerID="4af373b75c87b9e13e9f5b3ce8a670161ab21c49d5861a970763f95a4475c871" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.689986 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vw9rg"] Mar 10 00:12:45 crc kubenswrapper[4906]: E0310 00:12:45.691426 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4af373b75c87b9e13e9f5b3ce8a670161ab21c49d5861a970763f95a4475c871\": container with ID starting with 4af373b75c87b9e13e9f5b3ce8a670161ab21c49d5861a970763f95a4475c871 not found: ID does not exist" containerID="4af373b75c87b9e13e9f5b3ce8a670161ab21c49d5861a970763f95a4475c871" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.691466 4906 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4af373b75c87b9e13e9f5b3ce8a670161ab21c49d5861a970763f95a4475c871"} err="failed to get container status \"4af373b75c87b9e13e9f5b3ce8a670161ab21c49d5861a970763f95a4475c871\": rpc error: code = NotFound desc = could not find container \"4af373b75c87b9e13e9f5b3ce8a670161ab21c49d5861a970763f95a4475c871\": container with ID starting with 4af373b75c87b9e13e9f5b3ce8a670161ab21c49d5861a970763f95a4475c871 not found: ID does not exist" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.691490 4906 scope.go:117] "RemoveContainer" containerID="db247e3434e7832999a110107e7ad0e11c7fa548c68d84f4ebdd7bdc54438a1d" Mar 10 00:12:45 crc kubenswrapper[4906]: E0310 00:12:45.691938 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db247e3434e7832999a110107e7ad0e11c7fa548c68d84f4ebdd7bdc54438a1d\": container with ID starting with db247e3434e7832999a110107e7ad0e11c7fa548c68d84f4ebdd7bdc54438a1d not found: ID does not exist" containerID="db247e3434e7832999a110107e7ad0e11c7fa548c68d84f4ebdd7bdc54438a1d" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.691964 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db247e3434e7832999a110107e7ad0e11c7fa548c68d84f4ebdd7bdc54438a1d"} err="failed to get container status \"db247e3434e7832999a110107e7ad0e11c7fa548c68d84f4ebdd7bdc54438a1d\": rpc error: code = NotFound desc = could not find container \"db247e3434e7832999a110107e7ad0e11c7fa548c68d84f4ebdd7bdc54438a1d\": container with ID starting with db247e3434e7832999a110107e7ad0e11c7fa548c68d84f4ebdd7bdc54438a1d not found: ID does not exist" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.691978 4906 scope.go:117] "RemoveContainer" containerID="e57b6c2d2f8a97e0b7c0aa7eba8757eb9df4cb672e4128d3e93135e5f36c7d36" Mar 10 00:12:45 crc kubenswrapper[4906]: E0310 00:12:45.692259 4906 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"e57b6c2d2f8a97e0b7c0aa7eba8757eb9df4cb672e4128d3e93135e5f36c7d36\": container with ID starting with e57b6c2d2f8a97e0b7c0aa7eba8757eb9df4cb672e4128d3e93135e5f36c7d36 not found: ID does not exist" containerID="e57b6c2d2f8a97e0b7c0aa7eba8757eb9df4cb672e4128d3e93135e5f36c7d36" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.692306 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e57b6c2d2f8a97e0b7c0aa7eba8757eb9df4cb672e4128d3e93135e5f36c7d36"} err="failed to get container status \"e57b6c2d2f8a97e0b7c0aa7eba8757eb9df4cb672e4128d3e93135e5f36c7d36\": rpc error: code = NotFound desc = could not find container \"e57b6c2d2f8a97e0b7c0aa7eba8757eb9df4cb672e4128d3e93135e5f36c7d36\": container with ID starting with e57b6c2d2f8a97e0b7c0aa7eba8757eb9df4cb672e4128d3e93135e5f36c7d36 not found: ID does not exist" Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.694921 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vw9rg"] Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.703553 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nwlr8"] Mar 10 00:12:45 crc kubenswrapper[4906]: I0310 00:12:45.706590 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nwlr8"] Mar 10 00:12:46 crc kubenswrapper[4906]: I0310 00:12:46.409350 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bdv5q" event={"ID":"2bafcdb0-094e-4426-96b0-c23d59d49da2","Type":"ContainerStarted","Data":"fc7ac3da446c10cb4ffd413d0fe3bdefa4257e06dd17b2077ee2aea5fb630142"} Mar 10 00:12:46 crc kubenswrapper[4906]: I0310 00:12:46.409978 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-bdv5q" 
Mar 10 00:12:46 crc kubenswrapper[4906]: I0310 00:12:46.410004 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bdv5q" event={"ID":"2bafcdb0-094e-4426-96b0-c23d59d49da2","Type":"ContainerStarted","Data":"93fb00c200011f10de74dc28e0d286b518f9d485a434b1b2247f63d4474a1ca2"} Mar 10 00:12:46 crc kubenswrapper[4906]: I0310 00:12:46.417709 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-bdv5q" Mar 10 00:12:46 crc kubenswrapper[4906]: I0310 00:12:46.431777 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-bdv5q" podStartSLOduration=2.431754353 podStartE2EDuration="2.431754353s" podCreationTimestamp="2026-03-10 00:12:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:12:46.429542461 +0000 UTC m=+392.577437573" watchObservedRunningTime="2026-03-10 00:12:46.431754353 +0000 UTC m=+392.579649505" Mar 10 00:12:46 crc kubenswrapper[4906]: I0310 00:12:46.582257 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5331074a-1c86-455a-80e9-6f945936e218" path="/var/lib/kubelet/pods/5331074a-1c86-455a-80e9-6f945936e218/volumes" Mar 10 00:12:46 crc kubenswrapper[4906]: I0310 00:12:46.582870 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68c8b80a-0af0-46cb-8a57-a353444de9dc" path="/var/lib/kubelet/pods/68c8b80a-0af0-46cb-8a57-a353444de9dc/volumes" Mar 10 00:12:46 crc kubenswrapper[4906]: I0310 00:12:46.583438 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a46863f6-02e5-4d35-8b59-216377e41403" path="/var/lib/kubelet/pods/a46863f6-02e5-4d35-8b59-216377e41403/volumes" Mar 10 00:12:46 crc kubenswrapper[4906]: I0310 00:12:46.583916 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="dc4d2e8f-54ca-464b-b186-432747b22864" path="/var/lib/kubelet/pods/dc4d2e8f-54ca-464b-b186-432747b22864/volumes" Mar 10 00:12:46 crc kubenswrapper[4906]: I0310 00:12:46.584582 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1beb2f4-c1c5-488d-8c76-bed30174a0de" path="/var/lib/kubelet/pods/f1beb2f4-c1c5-488d-8c76-bed30174a0de/volumes" Mar 10 00:12:47 crc kubenswrapper[4906]: I0310 00:12:47.405190 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2zhb6"] Mar 10 00:12:47 crc kubenswrapper[4906]: E0310 00:12:47.405378 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc4d2e8f-54ca-464b-b186-432747b22864" containerName="extract-content" Mar 10 00:12:47 crc kubenswrapper[4906]: I0310 00:12:47.405389 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc4d2e8f-54ca-464b-b186-432747b22864" containerName="extract-content" Mar 10 00:12:47 crc kubenswrapper[4906]: E0310 00:12:47.405397 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc4d2e8f-54ca-464b-b186-432747b22864" containerName="registry-server" Mar 10 00:12:47 crc kubenswrapper[4906]: I0310 00:12:47.405403 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc4d2e8f-54ca-464b-b186-432747b22864" containerName="registry-server" Mar 10 00:12:47 crc kubenswrapper[4906]: E0310 00:12:47.405411 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1beb2f4-c1c5-488d-8c76-bed30174a0de" containerName="registry-server" Mar 10 00:12:47 crc kubenswrapper[4906]: I0310 00:12:47.405418 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1beb2f4-c1c5-488d-8c76-bed30174a0de" containerName="registry-server" Mar 10 00:12:47 crc kubenswrapper[4906]: E0310 00:12:47.405428 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1beb2f4-c1c5-488d-8c76-bed30174a0de" containerName="extract-content" Mar 10 00:12:47 crc kubenswrapper[4906]: I0310 
00:12:47.405434 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1beb2f4-c1c5-488d-8c76-bed30174a0de" containerName="extract-content" Mar 10 00:12:47 crc kubenswrapper[4906]: E0310 00:12:47.405449 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68c8b80a-0af0-46cb-8a57-a353444de9dc" containerName="registry-server" Mar 10 00:12:47 crc kubenswrapper[4906]: I0310 00:12:47.405456 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="68c8b80a-0af0-46cb-8a57-a353444de9dc" containerName="registry-server" Mar 10 00:12:47 crc kubenswrapper[4906]: E0310 00:12:47.405466 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68c8b80a-0af0-46cb-8a57-a353444de9dc" containerName="extract-utilities" Mar 10 00:12:47 crc kubenswrapper[4906]: I0310 00:12:47.405474 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="68c8b80a-0af0-46cb-8a57-a353444de9dc" containerName="extract-utilities" Mar 10 00:12:47 crc kubenswrapper[4906]: E0310 00:12:47.405483 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1beb2f4-c1c5-488d-8c76-bed30174a0de" containerName="extract-utilities" Mar 10 00:12:47 crc kubenswrapper[4906]: I0310 00:12:47.405489 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1beb2f4-c1c5-488d-8c76-bed30174a0de" containerName="extract-utilities" Mar 10 00:12:47 crc kubenswrapper[4906]: E0310 00:12:47.405497 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5331074a-1c86-455a-80e9-6f945936e218" containerName="extract-utilities" Mar 10 00:12:47 crc kubenswrapper[4906]: I0310 00:12:47.405503 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="5331074a-1c86-455a-80e9-6f945936e218" containerName="extract-utilities" Mar 10 00:12:47 crc kubenswrapper[4906]: E0310 00:12:47.405511 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5331074a-1c86-455a-80e9-6f945936e218" containerName="extract-content" Mar 10 00:12:47 crc kubenswrapper[4906]: I0310 
00:12:47.405516 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="5331074a-1c86-455a-80e9-6f945936e218" containerName="extract-content" Mar 10 00:12:47 crc kubenswrapper[4906]: E0310 00:12:47.405524 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68c8b80a-0af0-46cb-8a57-a353444de9dc" containerName="extract-content" Mar 10 00:12:47 crc kubenswrapper[4906]: I0310 00:12:47.405529 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="68c8b80a-0af0-46cb-8a57-a353444de9dc" containerName="extract-content" Mar 10 00:12:47 crc kubenswrapper[4906]: E0310 00:12:47.405537 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc4d2e8f-54ca-464b-b186-432747b22864" containerName="extract-utilities" Mar 10 00:12:47 crc kubenswrapper[4906]: I0310 00:12:47.405543 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc4d2e8f-54ca-464b-b186-432747b22864" containerName="extract-utilities" Mar 10 00:12:47 crc kubenswrapper[4906]: E0310 00:12:47.405551 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5331074a-1c86-455a-80e9-6f945936e218" containerName="registry-server" Mar 10 00:12:47 crc kubenswrapper[4906]: I0310 00:12:47.405557 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="5331074a-1c86-455a-80e9-6f945936e218" containerName="registry-server" Mar 10 00:12:47 crc kubenswrapper[4906]: E0310 00:12:47.405565 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a46863f6-02e5-4d35-8b59-216377e41403" containerName="marketplace-operator" Mar 10 00:12:47 crc kubenswrapper[4906]: I0310 00:12:47.405571 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="a46863f6-02e5-4d35-8b59-216377e41403" containerName="marketplace-operator" Mar 10 00:12:47 crc kubenswrapper[4906]: E0310 00:12:47.405578 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a46863f6-02e5-4d35-8b59-216377e41403" containerName="marketplace-operator" Mar 10 00:12:47 crc kubenswrapper[4906]: 
I0310 00:12:47.405584 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="a46863f6-02e5-4d35-8b59-216377e41403" containerName="marketplace-operator" Mar 10 00:12:47 crc kubenswrapper[4906]: I0310 00:12:47.405677 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="a46863f6-02e5-4d35-8b59-216377e41403" containerName="marketplace-operator" Mar 10 00:12:47 crc kubenswrapper[4906]: I0310 00:12:47.405686 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="a46863f6-02e5-4d35-8b59-216377e41403" containerName="marketplace-operator" Mar 10 00:12:47 crc kubenswrapper[4906]: I0310 00:12:47.405694 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="68c8b80a-0af0-46cb-8a57-a353444de9dc" containerName="registry-server" Mar 10 00:12:47 crc kubenswrapper[4906]: I0310 00:12:47.405704 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1beb2f4-c1c5-488d-8c76-bed30174a0de" containerName="registry-server" Mar 10 00:12:47 crc kubenswrapper[4906]: I0310 00:12:47.405714 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc4d2e8f-54ca-464b-b186-432747b22864" containerName="registry-server" Mar 10 00:12:47 crc kubenswrapper[4906]: I0310 00:12:47.405719 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="5331074a-1c86-455a-80e9-6f945936e218" containerName="registry-server" Mar 10 00:12:47 crc kubenswrapper[4906]: I0310 00:12:47.406352 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2zhb6" Mar 10 00:12:47 crc kubenswrapper[4906]: I0310 00:12:47.408195 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 10 00:12:47 crc kubenswrapper[4906]: I0310 00:12:47.418099 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2zhb6"] Mar 10 00:12:47 crc kubenswrapper[4906]: I0310 00:12:47.584336 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdcsm\" (UniqueName: \"kubernetes.io/projected/2b962aff-3926-4d40-b95f-ea1c8062ede2-kube-api-access-hdcsm\") pod \"certified-operators-2zhb6\" (UID: \"2b962aff-3926-4d40-b95f-ea1c8062ede2\") " pod="openshift-marketplace/certified-operators-2zhb6" Mar 10 00:12:47 crc kubenswrapper[4906]: I0310 00:12:47.584687 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b962aff-3926-4d40-b95f-ea1c8062ede2-utilities\") pod \"certified-operators-2zhb6\" (UID: \"2b962aff-3926-4d40-b95f-ea1c8062ede2\") " pod="openshift-marketplace/certified-operators-2zhb6" Mar 10 00:12:47 crc kubenswrapper[4906]: I0310 00:12:47.584773 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b962aff-3926-4d40-b95f-ea1c8062ede2-catalog-content\") pod \"certified-operators-2zhb6\" (UID: \"2b962aff-3926-4d40-b95f-ea1c8062ede2\") " pod="openshift-marketplace/certified-operators-2zhb6" Mar 10 00:12:47 crc kubenswrapper[4906]: I0310 00:12:47.686253 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b962aff-3926-4d40-b95f-ea1c8062ede2-catalog-content\") pod \"certified-operators-2zhb6\" (UID: 
\"2b962aff-3926-4d40-b95f-ea1c8062ede2\") " pod="openshift-marketplace/certified-operators-2zhb6" Mar 10 00:12:47 crc kubenswrapper[4906]: I0310 00:12:47.686336 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdcsm\" (UniqueName: \"kubernetes.io/projected/2b962aff-3926-4d40-b95f-ea1c8062ede2-kube-api-access-hdcsm\") pod \"certified-operators-2zhb6\" (UID: \"2b962aff-3926-4d40-b95f-ea1c8062ede2\") " pod="openshift-marketplace/certified-operators-2zhb6" Mar 10 00:12:47 crc kubenswrapper[4906]: I0310 00:12:47.686368 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b962aff-3926-4d40-b95f-ea1c8062ede2-utilities\") pod \"certified-operators-2zhb6\" (UID: \"2b962aff-3926-4d40-b95f-ea1c8062ede2\") " pod="openshift-marketplace/certified-operators-2zhb6" Mar 10 00:12:47 crc kubenswrapper[4906]: I0310 00:12:47.686760 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b962aff-3926-4d40-b95f-ea1c8062ede2-catalog-content\") pod \"certified-operators-2zhb6\" (UID: \"2b962aff-3926-4d40-b95f-ea1c8062ede2\") " pod="openshift-marketplace/certified-operators-2zhb6" Mar 10 00:12:47 crc kubenswrapper[4906]: I0310 00:12:47.686997 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b962aff-3926-4d40-b95f-ea1c8062ede2-utilities\") pod \"certified-operators-2zhb6\" (UID: \"2b962aff-3926-4d40-b95f-ea1c8062ede2\") " pod="openshift-marketplace/certified-operators-2zhb6" Mar 10 00:12:47 crc kubenswrapper[4906]: I0310 00:12:47.704936 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdcsm\" (UniqueName: \"kubernetes.io/projected/2b962aff-3926-4d40-b95f-ea1c8062ede2-kube-api-access-hdcsm\") pod \"certified-operators-2zhb6\" (UID: 
\"2b962aff-3926-4d40-b95f-ea1c8062ede2\") " pod="openshift-marketplace/certified-operators-2zhb6" Mar 10 00:12:47 crc kubenswrapper[4906]: I0310 00:12:47.722189 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2zhb6" Mar 10 00:12:47 crc kubenswrapper[4906]: I0310 00:12:47.931312 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2zhb6"] Mar 10 00:12:47 crc kubenswrapper[4906]: W0310 00:12:47.941261 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b962aff_3926_4d40_b95f_ea1c8062ede2.slice/crio-e7dbc3665b1675103a1b09e39f145d26f26839c2998bc99ede682c0998856eeb WatchSource:0}: Error finding container e7dbc3665b1675103a1b09e39f145d26f26839c2998bc99ede682c0998856eeb: Status 404 returned error can't find the container with id e7dbc3665b1675103a1b09e39f145d26f26839c2998bc99ede682c0998856eeb Mar 10 00:12:48 crc kubenswrapper[4906]: I0310 00:12:48.001800 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9x7g5"] Mar 10 00:12:48 crc kubenswrapper[4906]: I0310 00:12:48.003072 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9x7g5" Mar 10 00:12:48 crc kubenswrapper[4906]: I0310 00:12:48.004827 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 10 00:12:48 crc kubenswrapper[4906]: I0310 00:12:48.013028 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9x7g5"] Mar 10 00:12:48 crc kubenswrapper[4906]: I0310 00:12:48.089944 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac5a82c2-3734-4064-bb05-2cf40dededee-utilities\") pod \"community-operators-9x7g5\" (UID: \"ac5a82c2-3734-4064-bb05-2cf40dededee\") " pod="openshift-marketplace/community-operators-9x7g5" Mar 10 00:12:48 crc kubenswrapper[4906]: I0310 00:12:48.090009 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac5a82c2-3734-4064-bb05-2cf40dededee-catalog-content\") pod \"community-operators-9x7g5\" (UID: \"ac5a82c2-3734-4064-bb05-2cf40dededee\") " pod="openshift-marketplace/community-operators-9x7g5" Mar 10 00:12:48 crc kubenswrapper[4906]: I0310 00:12:48.090125 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsjm2\" (UniqueName: \"kubernetes.io/projected/ac5a82c2-3734-4064-bb05-2cf40dededee-kube-api-access-tsjm2\") pod \"community-operators-9x7g5\" (UID: \"ac5a82c2-3734-4064-bb05-2cf40dededee\") " pod="openshift-marketplace/community-operators-9x7g5" Mar 10 00:12:48 crc kubenswrapper[4906]: I0310 00:12:48.193246 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac5a82c2-3734-4064-bb05-2cf40dededee-utilities\") pod \"community-operators-9x7g5\" (UID: 
\"ac5a82c2-3734-4064-bb05-2cf40dededee\") " pod="openshift-marketplace/community-operators-9x7g5" Mar 10 00:12:48 crc kubenswrapper[4906]: I0310 00:12:48.193331 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac5a82c2-3734-4064-bb05-2cf40dededee-catalog-content\") pod \"community-operators-9x7g5\" (UID: \"ac5a82c2-3734-4064-bb05-2cf40dededee\") " pod="openshift-marketplace/community-operators-9x7g5" Mar 10 00:12:48 crc kubenswrapper[4906]: I0310 00:12:48.193382 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsjm2\" (UniqueName: \"kubernetes.io/projected/ac5a82c2-3734-4064-bb05-2cf40dededee-kube-api-access-tsjm2\") pod \"community-operators-9x7g5\" (UID: \"ac5a82c2-3734-4064-bb05-2cf40dededee\") " pod="openshift-marketplace/community-operators-9x7g5" Mar 10 00:12:48 crc kubenswrapper[4906]: I0310 00:12:48.193938 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac5a82c2-3734-4064-bb05-2cf40dededee-catalog-content\") pod \"community-operators-9x7g5\" (UID: \"ac5a82c2-3734-4064-bb05-2cf40dededee\") " pod="openshift-marketplace/community-operators-9x7g5" Mar 10 00:12:48 crc kubenswrapper[4906]: I0310 00:12:48.194314 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac5a82c2-3734-4064-bb05-2cf40dededee-utilities\") pod \"community-operators-9x7g5\" (UID: \"ac5a82c2-3734-4064-bb05-2cf40dededee\") " pod="openshift-marketplace/community-operators-9x7g5" Mar 10 00:12:48 crc kubenswrapper[4906]: I0310 00:12:48.209982 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsjm2\" (UniqueName: \"kubernetes.io/projected/ac5a82c2-3734-4064-bb05-2cf40dededee-kube-api-access-tsjm2\") pod \"community-operators-9x7g5\" (UID: 
\"ac5a82c2-3734-4064-bb05-2cf40dededee\") " pod="openshift-marketplace/community-operators-9x7g5" Mar 10 00:12:48 crc kubenswrapper[4906]: I0310 00:12:48.391112 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9x7g5" Mar 10 00:12:48 crc kubenswrapper[4906]: I0310 00:12:48.436557 4906 generic.go:334] "Generic (PLEG): container finished" podID="2b962aff-3926-4d40-b95f-ea1c8062ede2" containerID="abc7b9c3c9f7683995a4b73ee67daaf921151262849515e9e811fe2f135ab139" exitCode=0 Mar 10 00:12:48 crc kubenswrapper[4906]: I0310 00:12:48.436681 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2zhb6" event={"ID":"2b962aff-3926-4d40-b95f-ea1c8062ede2","Type":"ContainerDied","Data":"abc7b9c3c9f7683995a4b73ee67daaf921151262849515e9e811fe2f135ab139"} Mar 10 00:12:48 crc kubenswrapper[4906]: I0310 00:12:48.436785 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2zhb6" event={"ID":"2b962aff-3926-4d40-b95f-ea1c8062ede2","Type":"ContainerStarted","Data":"e7dbc3665b1675103a1b09e39f145d26f26839c2998bc99ede682c0998856eeb"} Mar 10 00:12:48 crc kubenswrapper[4906]: I0310 00:12:48.657599 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9x7g5"] Mar 10 00:12:48 crc kubenswrapper[4906]: W0310 00:12:48.659187 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac5a82c2_3734_4064_bb05_2cf40dededee.slice/crio-9aee14af355eff9ad3310b496020de357c93b5f1b099966c398b7b636c265fcb WatchSource:0}: Error finding container 9aee14af355eff9ad3310b496020de357c93b5f1b099966c398b7b636c265fcb: Status 404 returned error can't find the container with id 9aee14af355eff9ad3310b496020de357c93b5f1b099966c398b7b636c265fcb Mar 10 00:12:49 crc kubenswrapper[4906]: I0310 00:12:49.444979 4906 generic.go:334] "Generic 
(PLEG): container finished" podID="ac5a82c2-3734-4064-bb05-2cf40dededee" containerID="eb971d77b7b20be2bf1e2c60e7aff29fd0259ce2e2a560e713f184790dd4cdd9" exitCode=0 Mar 10 00:12:49 crc kubenswrapper[4906]: I0310 00:12:49.445199 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9x7g5" event={"ID":"ac5a82c2-3734-4064-bb05-2cf40dededee","Type":"ContainerDied","Data":"eb971d77b7b20be2bf1e2c60e7aff29fd0259ce2e2a560e713f184790dd4cdd9"} Mar 10 00:12:49 crc kubenswrapper[4906]: I0310 00:12:49.445393 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9x7g5" event={"ID":"ac5a82c2-3734-4064-bb05-2cf40dededee","Type":"ContainerStarted","Data":"9aee14af355eff9ad3310b496020de357c93b5f1b099966c398b7b636c265fcb"} Mar 10 00:12:49 crc kubenswrapper[4906]: I0310 00:12:49.448993 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2zhb6" event={"ID":"2b962aff-3926-4d40-b95f-ea1c8062ede2","Type":"ContainerStarted","Data":"baa36ca52f3ef352457161870100fc49e651f4e3e744f0bf9884ff9df26fc073"} Mar 10 00:12:49 crc kubenswrapper[4906]: I0310 00:12:49.804748 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4cfdg"] Mar 10 00:12:49 crc kubenswrapper[4906]: I0310 00:12:49.807257 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4cfdg" Mar 10 00:12:49 crc kubenswrapper[4906]: I0310 00:12:49.812000 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 10 00:12:49 crc kubenswrapper[4906]: I0310 00:12:49.816786 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4cfdg"] Mar 10 00:12:49 crc kubenswrapper[4906]: I0310 00:12:49.827288 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0161a078-8da8-4080-bd86-1f8adfd0b57c-catalog-content\") pod \"redhat-marketplace-4cfdg\" (UID: \"0161a078-8da8-4080-bd86-1f8adfd0b57c\") " pod="openshift-marketplace/redhat-marketplace-4cfdg" Mar 10 00:12:49 crc kubenswrapper[4906]: I0310 00:12:49.827342 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0161a078-8da8-4080-bd86-1f8adfd0b57c-utilities\") pod \"redhat-marketplace-4cfdg\" (UID: \"0161a078-8da8-4080-bd86-1f8adfd0b57c\") " pod="openshift-marketplace/redhat-marketplace-4cfdg" Mar 10 00:12:49 crc kubenswrapper[4906]: I0310 00:12:49.827411 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbv5p\" (UniqueName: \"kubernetes.io/projected/0161a078-8da8-4080-bd86-1f8adfd0b57c-kube-api-access-gbv5p\") pod \"redhat-marketplace-4cfdg\" (UID: \"0161a078-8da8-4080-bd86-1f8adfd0b57c\") " pod="openshift-marketplace/redhat-marketplace-4cfdg" Mar 10 00:12:49 crc kubenswrapper[4906]: I0310 00:12:49.928782 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0161a078-8da8-4080-bd86-1f8adfd0b57c-catalog-content\") pod \"redhat-marketplace-4cfdg\" (UID: 
\"0161a078-8da8-4080-bd86-1f8adfd0b57c\") " pod="openshift-marketplace/redhat-marketplace-4cfdg" Mar 10 00:12:49 crc kubenswrapper[4906]: I0310 00:12:49.928828 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0161a078-8da8-4080-bd86-1f8adfd0b57c-utilities\") pod \"redhat-marketplace-4cfdg\" (UID: \"0161a078-8da8-4080-bd86-1f8adfd0b57c\") " pod="openshift-marketplace/redhat-marketplace-4cfdg" Mar 10 00:12:49 crc kubenswrapper[4906]: I0310 00:12:49.928888 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbv5p\" (UniqueName: \"kubernetes.io/projected/0161a078-8da8-4080-bd86-1f8adfd0b57c-kube-api-access-gbv5p\") pod \"redhat-marketplace-4cfdg\" (UID: \"0161a078-8da8-4080-bd86-1f8adfd0b57c\") " pod="openshift-marketplace/redhat-marketplace-4cfdg" Mar 10 00:12:49 crc kubenswrapper[4906]: I0310 00:12:49.929275 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0161a078-8da8-4080-bd86-1f8adfd0b57c-catalog-content\") pod \"redhat-marketplace-4cfdg\" (UID: \"0161a078-8da8-4080-bd86-1f8adfd0b57c\") " pod="openshift-marketplace/redhat-marketplace-4cfdg" Mar 10 00:12:49 crc kubenswrapper[4906]: I0310 00:12:49.929706 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0161a078-8da8-4080-bd86-1f8adfd0b57c-utilities\") pod \"redhat-marketplace-4cfdg\" (UID: \"0161a078-8da8-4080-bd86-1f8adfd0b57c\") " pod="openshift-marketplace/redhat-marketplace-4cfdg" Mar 10 00:12:49 crc kubenswrapper[4906]: I0310 00:12:49.979089 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbv5p\" (UniqueName: \"kubernetes.io/projected/0161a078-8da8-4080-bd86-1f8adfd0b57c-kube-api-access-gbv5p\") pod \"redhat-marketplace-4cfdg\" (UID: \"0161a078-8da8-4080-bd86-1f8adfd0b57c\") " 
pod="openshift-marketplace/redhat-marketplace-4cfdg" Mar 10 00:12:50 crc kubenswrapper[4906]: I0310 00:12:50.131231 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4cfdg" Mar 10 00:12:50 crc kubenswrapper[4906]: I0310 00:12:50.353533 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4cfdg"] Mar 10 00:12:50 crc kubenswrapper[4906]: W0310 00:12:50.364341 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0161a078_8da8_4080_bd86_1f8adfd0b57c.slice/crio-933e4415816dbf10cf8f3c3fe9658c7636764303a253e905a19e9de76ddb9de9 WatchSource:0}: Error finding container 933e4415816dbf10cf8f3c3fe9658c7636764303a253e905a19e9de76ddb9de9: Status 404 returned error can't find the container with id 933e4415816dbf10cf8f3c3fe9658c7636764303a253e905a19e9de76ddb9de9 Mar 10 00:12:50 crc kubenswrapper[4906]: I0310 00:12:50.455784 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4cfdg" event={"ID":"0161a078-8da8-4080-bd86-1f8adfd0b57c","Type":"ContainerStarted","Data":"933e4415816dbf10cf8f3c3fe9658c7636764303a253e905a19e9de76ddb9de9"} Mar 10 00:12:50 crc kubenswrapper[4906]: I0310 00:12:50.457780 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9x7g5" event={"ID":"ac5a82c2-3734-4064-bb05-2cf40dededee","Type":"ContainerStarted","Data":"799d0220500b3e0d1dc20c73b32ddaa31016d97d3349fd652b60f380b7970734"} Mar 10 00:12:50 crc kubenswrapper[4906]: I0310 00:12:50.460423 4906 generic.go:334] "Generic (PLEG): container finished" podID="2b962aff-3926-4d40-b95f-ea1c8062ede2" containerID="baa36ca52f3ef352457161870100fc49e651f4e3e744f0bf9884ff9df26fc073" exitCode=0 Mar 10 00:12:50 crc kubenswrapper[4906]: I0310 00:12:50.460481 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-2zhb6" event={"ID":"2b962aff-3926-4d40-b95f-ea1c8062ede2","Type":"ContainerDied","Data":"baa36ca52f3ef352457161870100fc49e651f4e3e744f0bf9884ff9df26fc073"} Mar 10 00:12:50 crc kubenswrapper[4906]: I0310 00:12:50.804441 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zr8fw"] Mar 10 00:12:50 crc kubenswrapper[4906]: I0310 00:12:50.806312 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zr8fw" Mar 10 00:12:50 crc kubenswrapper[4906]: I0310 00:12:50.809862 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 10 00:12:50 crc kubenswrapper[4906]: I0310 00:12:50.813919 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zr8fw"] Mar 10 00:12:50 crc kubenswrapper[4906]: I0310 00:12:50.842238 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c6467f0-f671-4388-9851-05416de6f4b1-catalog-content\") pod \"redhat-operators-zr8fw\" (UID: \"0c6467f0-f671-4388-9851-05416de6f4b1\") " pod="openshift-marketplace/redhat-operators-zr8fw" Mar 10 00:12:50 crc kubenswrapper[4906]: I0310 00:12:50.842318 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmgjr\" (UniqueName: \"kubernetes.io/projected/0c6467f0-f671-4388-9851-05416de6f4b1-kube-api-access-hmgjr\") pod \"redhat-operators-zr8fw\" (UID: \"0c6467f0-f671-4388-9851-05416de6f4b1\") " pod="openshift-marketplace/redhat-operators-zr8fw" Mar 10 00:12:50 crc kubenswrapper[4906]: I0310 00:12:50.842353 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/0c6467f0-f671-4388-9851-05416de6f4b1-utilities\") pod \"redhat-operators-zr8fw\" (UID: \"0c6467f0-f671-4388-9851-05416de6f4b1\") " pod="openshift-marketplace/redhat-operators-zr8fw" Mar 10 00:12:50 crc kubenswrapper[4906]: I0310 00:12:50.943208 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c6467f0-f671-4388-9851-05416de6f4b1-catalog-content\") pod \"redhat-operators-zr8fw\" (UID: \"0c6467f0-f671-4388-9851-05416de6f4b1\") " pod="openshift-marketplace/redhat-operators-zr8fw" Mar 10 00:12:50 crc kubenswrapper[4906]: I0310 00:12:50.943296 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmgjr\" (UniqueName: \"kubernetes.io/projected/0c6467f0-f671-4388-9851-05416de6f4b1-kube-api-access-hmgjr\") pod \"redhat-operators-zr8fw\" (UID: \"0c6467f0-f671-4388-9851-05416de6f4b1\") " pod="openshift-marketplace/redhat-operators-zr8fw" Mar 10 00:12:50 crc kubenswrapper[4906]: I0310 00:12:50.943325 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c6467f0-f671-4388-9851-05416de6f4b1-utilities\") pod \"redhat-operators-zr8fw\" (UID: \"0c6467f0-f671-4388-9851-05416de6f4b1\") " pod="openshift-marketplace/redhat-operators-zr8fw" Mar 10 00:12:50 crc kubenswrapper[4906]: I0310 00:12:50.944317 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c6467f0-f671-4388-9851-05416de6f4b1-utilities\") pod \"redhat-operators-zr8fw\" (UID: \"0c6467f0-f671-4388-9851-05416de6f4b1\") " pod="openshift-marketplace/redhat-operators-zr8fw" Mar 10 00:12:50 crc kubenswrapper[4906]: I0310 00:12:50.945360 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/0c6467f0-f671-4388-9851-05416de6f4b1-catalog-content\") pod \"redhat-operators-zr8fw\" (UID: \"0c6467f0-f671-4388-9851-05416de6f4b1\") " pod="openshift-marketplace/redhat-operators-zr8fw" Mar 10 00:12:50 crc kubenswrapper[4906]: I0310 00:12:50.966431 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmgjr\" (UniqueName: \"kubernetes.io/projected/0c6467f0-f671-4388-9851-05416de6f4b1-kube-api-access-hmgjr\") pod \"redhat-operators-zr8fw\" (UID: \"0c6467f0-f671-4388-9851-05416de6f4b1\") " pod="openshift-marketplace/redhat-operators-zr8fw" Mar 10 00:12:51 crc kubenswrapper[4906]: I0310 00:12:51.123553 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zr8fw" Mar 10 00:12:51 crc kubenswrapper[4906]: I0310 00:12:51.343221 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zr8fw"] Mar 10 00:12:51 crc kubenswrapper[4906]: W0310 00:12:51.355446 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c6467f0_f671_4388_9851_05416de6f4b1.slice/crio-b10a7cd6659d8efdcc6c44f3b3c02e66d7da1203e6c197819e8e8a3ec4795b6b WatchSource:0}: Error finding container b10a7cd6659d8efdcc6c44f3b3c02e66d7da1203e6c197819e8e8a3ec4795b6b: Status 404 returned error can't find the container with id b10a7cd6659d8efdcc6c44f3b3c02e66d7da1203e6c197819e8e8a3ec4795b6b Mar 10 00:12:51 crc kubenswrapper[4906]: I0310 00:12:51.467998 4906 generic.go:334] "Generic (PLEG): container finished" podID="ac5a82c2-3734-4064-bb05-2cf40dededee" containerID="799d0220500b3e0d1dc20c73b32ddaa31016d97d3349fd652b60f380b7970734" exitCode=0 Mar 10 00:12:51 crc kubenswrapper[4906]: I0310 00:12:51.468059 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9x7g5" 
event={"ID":"ac5a82c2-3734-4064-bb05-2cf40dededee","Type":"ContainerDied","Data":"799d0220500b3e0d1dc20c73b32ddaa31016d97d3349fd652b60f380b7970734"} Mar 10 00:12:51 crc kubenswrapper[4906]: I0310 00:12:51.472159 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zr8fw" event={"ID":"0c6467f0-f671-4388-9851-05416de6f4b1","Type":"ContainerStarted","Data":"b10a7cd6659d8efdcc6c44f3b3c02e66d7da1203e6c197819e8e8a3ec4795b6b"} Mar 10 00:12:51 crc kubenswrapper[4906]: I0310 00:12:51.479307 4906 generic.go:334] "Generic (PLEG): container finished" podID="0161a078-8da8-4080-bd86-1f8adfd0b57c" containerID="6d048c1b1fb5e5cc9168f47665f16891a77c2179593269f5d368dc895740a4a1" exitCode=0 Mar 10 00:12:51 crc kubenswrapper[4906]: I0310 00:12:51.479352 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4cfdg" event={"ID":"0161a078-8da8-4080-bd86-1f8adfd0b57c","Type":"ContainerDied","Data":"6d048c1b1fb5e5cc9168f47665f16891a77c2179593269f5d368dc895740a4a1"} Mar 10 00:12:52 crc kubenswrapper[4906]: I0310 00:12:52.487607 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9x7g5" event={"ID":"ac5a82c2-3734-4064-bb05-2cf40dededee","Type":"ContainerStarted","Data":"ade01b502487ba0913b75ada5b42a47c26d18923d455515d6dd7ba1aa9e07c27"} Mar 10 00:12:52 crc kubenswrapper[4906]: I0310 00:12:52.489262 4906 generic.go:334] "Generic (PLEG): container finished" podID="0c6467f0-f671-4388-9851-05416de6f4b1" containerID="ef04585c0ae9b6ecb664a91d5633482a4d46a7c4040843a02e9679eebaf7c26f" exitCode=0 Mar 10 00:12:52 crc kubenswrapper[4906]: I0310 00:12:52.489917 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zr8fw" event={"ID":"0c6467f0-f671-4388-9851-05416de6f4b1","Type":"ContainerDied","Data":"ef04585c0ae9b6ecb664a91d5633482a4d46a7c4040843a02e9679eebaf7c26f"} Mar 10 00:12:52 crc kubenswrapper[4906]: I0310 
00:12:52.493045 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2zhb6" event={"ID":"2b962aff-3926-4d40-b95f-ea1c8062ede2","Type":"ContainerStarted","Data":"bca5115bb23fdde05a42dad8373d89ea016a08770d3686c0eab1ba64f3e7fc94"} Mar 10 00:12:52 crc kubenswrapper[4906]: I0310 00:12:52.499377 4906 generic.go:334] "Generic (PLEG): container finished" podID="0161a078-8da8-4080-bd86-1f8adfd0b57c" containerID="e3f7359e9f207f873fcec10743ba6cdbf24703c42302ab2dbd7e6039f81062bd" exitCode=0 Mar 10 00:12:52 crc kubenswrapper[4906]: I0310 00:12:52.499417 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4cfdg" event={"ID":"0161a078-8da8-4080-bd86-1f8adfd0b57c","Type":"ContainerDied","Data":"e3f7359e9f207f873fcec10743ba6cdbf24703c42302ab2dbd7e6039f81062bd"} Mar 10 00:12:52 crc kubenswrapper[4906]: I0310 00:12:52.537291 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9x7g5" podStartSLOduration=2.970304216 podStartE2EDuration="5.537261581s" podCreationTimestamp="2026-03-10 00:12:47 +0000 UTC" firstStartedPulling="2026-03-10 00:12:49.446915045 +0000 UTC m=+395.594810167" lastFinishedPulling="2026-03-10 00:12:52.01387242 +0000 UTC m=+398.161767532" observedRunningTime="2026-03-10 00:12:52.512197003 +0000 UTC m=+398.660092105" watchObservedRunningTime="2026-03-10 00:12:52.537261581 +0000 UTC m=+398.685156693" Mar 10 00:12:52 crc kubenswrapper[4906]: I0310 00:12:52.588739 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2zhb6" podStartSLOduration=2.723369257 podStartE2EDuration="5.588714736s" podCreationTimestamp="2026-03-10 00:12:47 +0000 UTC" firstStartedPulling="2026-03-10 00:12:48.439164694 +0000 UTC m=+394.587059806" lastFinishedPulling="2026-03-10 00:12:51.304510173 +0000 UTC m=+397.452405285" observedRunningTime="2026-03-10 
00:12:52.586247676 +0000 UTC m=+398.734142818" watchObservedRunningTime="2026-03-10 00:12:52.588714736 +0000 UTC m=+398.736609848" Mar 10 00:12:52 crc kubenswrapper[4906]: I0310 00:12:52.783544 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-qrjdl"] Mar 10 00:12:52 crc kubenswrapper[4906]: I0310 00:12:52.784357 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-qrjdl" Mar 10 00:12:52 crc kubenswrapper[4906]: I0310 00:12:52.800291 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-qrjdl"] Mar 10 00:12:52 crc kubenswrapper[4906]: I0310 00:12:52.975569 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f2fddd3-37d8-4373-8324-4c0effebe3cd-installation-pull-secrets\") pod \"image-registry-66df7c8f76-qrjdl\" (UID: \"8f2fddd3-37d8-4373-8324-4c0effebe3cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-qrjdl" Mar 10 00:12:52 crc kubenswrapper[4906]: I0310 00:12:52.975657 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z492t\" (UniqueName: \"kubernetes.io/projected/8f2fddd3-37d8-4373-8324-4c0effebe3cd-kube-api-access-z492t\") pod \"image-registry-66df7c8f76-qrjdl\" (UID: \"8f2fddd3-37d8-4373-8324-4c0effebe3cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-qrjdl" Mar 10 00:12:52 crc kubenswrapper[4906]: I0310 00:12:52.975757 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f2fddd3-37d8-4373-8324-4c0effebe3cd-ca-trust-extracted\") pod \"image-registry-66df7c8f76-qrjdl\" (UID: \"8f2fddd3-37d8-4373-8324-4c0effebe3cd\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-qrjdl" Mar 10 00:12:52 crc kubenswrapper[4906]: I0310 00:12:52.975827 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-qrjdl\" (UID: \"8f2fddd3-37d8-4373-8324-4c0effebe3cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-qrjdl" Mar 10 00:12:52 crc kubenswrapper[4906]: I0310 00:12:52.975855 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f2fddd3-37d8-4373-8324-4c0effebe3cd-registry-tls\") pod \"image-registry-66df7c8f76-qrjdl\" (UID: \"8f2fddd3-37d8-4373-8324-4c0effebe3cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-qrjdl" Mar 10 00:12:52 crc kubenswrapper[4906]: I0310 00:12:52.975880 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f2fddd3-37d8-4373-8324-4c0effebe3cd-registry-certificates\") pod \"image-registry-66df7c8f76-qrjdl\" (UID: \"8f2fddd3-37d8-4373-8324-4c0effebe3cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-qrjdl" Mar 10 00:12:52 crc kubenswrapper[4906]: I0310 00:12:52.975944 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f2fddd3-37d8-4373-8324-4c0effebe3cd-trusted-ca\") pod \"image-registry-66df7c8f76-qrjdl\" (UID: \"8f2fddd3-37d8-4373-8324-4c0effebe3cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-qrjdl" Mar 10 00:12:52 crc kubenswrapper[4906]: I0310 00:12:52.975996 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/8f2fddd3-37d8-4373-8324-4c0effebe3cd-bound-sa-token\") pod \"image-registry-66df7c8f76-qrjdl\" (UID: \"8f2fddd3-37d8-4373-8324-4c0effebe3cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-qrjdl" Mar 10 00:12:53 crc kubenswrapper[4906]: I0310 00:12:53.013888 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-qrjdl\" (UID: \"8f2fddd3-37d8-4373-8324-4c0effebe3cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-qrjdl" Mar 10 00:12:53 crc kubenswrapper[4906]: I0310 00:12:53.076937 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z492t\" (UniqueName: \"kubernetes.io/projected/8f2fddd3-37d8-4373-8324-4c0effebe3cd-kube-api-access-z492t\") pod \"image-registry-66df7c8f76-qrjdl\" (UID: \"8f2fddd3-37d8-4373-8324-4c0effebe3cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-qrjdl" Mar 10 00:12:53 crc kubenswrapper[4906]: I0310 00:12:53.076978 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f2fddd3-37d8-4373-8324-4c0effebe3cd-ca-trust-extracted\") pod \"image-registry-66df7c8f76-qrjdl\" (UID: \"8f2fddd3-37d8-4373-8324-4c0effebe3cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-qrjdl" Mar 10 00:12:53 crc kubenswrapper[4906]: I0310 00:12:53.077003 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f2fddd3-37d8-4373-8324-4c0effebe3cd-registry-tls\") pod \"image-registry-66df7c8f76-qrjdl\" (UID: \"8f2fddd3-37d8-4373-8324-4c0effebe3cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-qrjdl" Mar 10 00:12:53 crc kubenswrapper[4906]: I0310 00:12:53.077025 4906 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f2fddd3-37d8-4373-8324-4c0effebe3cd-registry-certificates\") pod \"image-registry-66df7c8f76-qrjdl\" (UID: \"8f2fddd3-37d8-4373-8324-4c0effebe3cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-qrjdl" Mar 10 00:12:53 crc kubenswrapper[4906]: I0310 00:12:53.077054 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f2fddd3-37d8-4373-8324-4c0effebe3cd-trusted-ca\") pod \"image-registry-66df7c8f76-qrjdl\" (UID: \"8f2fddd3-37d8-4373-8324-4c0effebe3cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-qrjdl" Mar 10 00:12:53 crc kubenswrapper[4906]: I0310 00:12:53.077072 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f2fddd3-37d8-4373-8324-4c0effebe3cd-bound-sa-token\") pod \"image-registry-66df7c8f76-qrjdl\" (UID: \"8f2fddd3-37d8-4373-8324-4c0effebe3cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-qrjdl" Mar 10 00:12:53 crc kubenswrapper[4906]: I0310 00:12:53.077099 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f2fddd3-37d8-4373-8324-4c0effebe3cd-installation-pull-secrets\") pod \"image-registry-66df7c8f76-qrjdl\" (UID: \"8f2fddd3-37d8-4373-8324-4c0effebe3cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-qrjdl" Mar 10 00:12:53 crc kubenswrapper[4906]: I0310 00:12:53.077709 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f2fddd3-37d8-4373-8324-4c0effebe3cd-ca-trust-extracted\") pod \"image-registry-66df7c8f76-qrjdl\" (UID: \"8f2fddd3-37d8-4373-8324-4c0effebe3cd\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-qrjdl" Mar 10 00:12:53 crc kubenswrapper[4906]: I0310 00:12:53.078732 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f2fddd3-37d8-4373-8324-4c0effebe3cd-trusted-ca\") pod \"image-registry-66df7c8f76-qrjdl\" (UID: \"8f2fddd3-37d8-4373-8324-4c0effebe3cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-qrjdl" Mar 10 00:12:53 crc kubenswrapper[4906]: I0310 00:12:53.078851 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f2fddd3-37d8-4373-8324-4c0effebe3cd-registry-certificates\") pod \"image-registry-66df7c8f76-qrjdl\" (UID: \"8f2fddd3-37d8-4373-8324-4c0effebe3cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-qrjdl" Mar 10 00:12:53 crc kubenswrapper[4906]: I0310 00:12:53.091438 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f2fddd3-37d8-4373-8324-4c0effebe3cd-installation-pull-secrets\") pod \"image-registry-66df7c8f76-qrjdl\" (UID: \"8f2fddd3-37d8-4373-8324-4c0effebe3cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-qrjdl" Mar 10 00:12:53 crc kubenswrapper[4906]: I0310 00:12:53.094504 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z492t\" (UniqueName: \"kubernetes.io/projected/8f2fddd3-37d8-4373-8324-4c0effebe3cd-kube-api-access-z492t\") pod \"image-registry-66df7c8f76-qrjdl\" (UID: \"8f2fddd3-37d8-4373-8324-4c0effebe3cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-qrjdl" Mar 10 00:12:53 crc kubenswrapper[4906]: I0310 00:12:53.099167 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f2fddd3-37d8-4373-8324-4c0effebe3cd-registry-tls\") pod \"image-registry-66df7c8f76-qrjdl\" 
(UID: \"8f2fddd3-37d8-4373-8324-4c0effebe3cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-qrjdl" Mar 10 00:12:53 crc kubenswrapper[4906]: I0310 00:12:53.101209 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f2fddd3-37d8-4373-8324-4c0effebe3cd-bound-sa-token\") pod \"image-registry-66df7c8f76-qrjdl\" (UID: \"8f2fddd3-37d8-4373-8324-4c0effebe3cd\") " pod="openshift-image-registry/image-registry-66df7c8f76-qrjdl" Mar 10 00:12:53 crc kubenswrapper[4906]: I0310 00:12:53.397832 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-qrjdl" Mar 10 00:12:53 crc kubenswrapper[4906]: I0310 00:12:53.511975 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zr8fw" event={"ID":"0c6467f0-f671-4388-9851-05416de6f4b1","Type":"ContainerStarted","Data":"dfb7e1aec7158e8f42b503ba782130c981e354402a4b978659c4d24a852e85b2"} Mar 10 00:12:53 crc kubenswrapper[4906]: I0310 00:12:53.515145 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4cfdg" event={"ID":"0161a078-8da8-4080-bd86-1f8adfd0b57c","Type":"ContainerStarted","Data":"d7c471fe26d5a03baf3db8c54f5cf5add3436b1e9e3d3ce8798101f4493e513b"} Mar 10 00:12:53 crc kubenswrapper[4906]: I0310 00:12:53.600480 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4cfdg" podStartSLOduration=3.180038467 podStartE2EDuration="4.600441478s" podCreationTimestamp="2026-03-10 00:12:49 +0000 UTC" firstStartedPulling="2026-03-10 00:12:51.480700743 +0000 UTC m=+397.628595865" lastFinishedPulling="2026-03-10 00:12:52.901103764 +0000 UTC m=+399.048998876" observedRunningTime="2026-03-10 00:12:53.558059401 +0000 UTC m=+399.705954513" watchObservedRunningTime="2026-03-10 00:12:53.600441478 +0000 UTC m=+399.748336610" Mar 10 
00:12:53 crc kubenswrapper[4906]: I0310 00:12:53.604331 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-qrjdl"] Mar 10 00:12:53 crc kubenswrapper[4906]: W0310 00:12:53.614573 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f2fddd3_37d8_4373_8324_4c0effebe3cd.slice/crio-ba022fd6c4217f1f33ccf54fff5c2ec04d9983a29cee8f53b36d12de65034b6d WatchSource:0}: Error finding container ba022fd6c4217f1f33ccf54fff5c2ec04d9983a29cee8f53b36d12de65034b6d: Status 404 returned error can't find the container with id ba022fd6c4217f1f33ccf54fff5c2ec04d9983a29cee8f53b36d12de65034b6d Mar 10 00:12:54 crc kubenswrapper[4906]: I0310 00:12:54.527222 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-qrjdl" event={"ID":"8f2fddd3-37d8-4373-8324-4c0effebe3cd","Type":"ContainerStarted","Data":"5cb8c9a3b286e52d194f0d3e75aa60fb60231bf2936b024542dc2f7b0f80abf2"} Mar 10 00:12:54 crc kubenswrapper[4906]: I0310 00:12:54.527269 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-qrjdl" event={"ID":"8f2fddd3-37d8-4373-8324-4c0effebe3cd","Type":"ContainerStarted","Data":"ba022fd6c4217f1f33ccf54fff5c2ec04d9983a29cee8f53b36d12de65034b6d"} Mar 10 00:12:54 crc kubenswrapper[4906]: I0310 00:12:54.527347 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-qrjdl" Mar 10 00:12:54 crc kubenswrapper[4906]: I0310 00:12:54.530388 4906 generic.go:334] "Generic (PLEG): container finished" podID="0c6467f0-f671-4388-9851-05416de6f4b1" containerID="dfb7e1aec7158e8f42b503ba782130c981e354402a4b978659c4d24a852e85b2" exitCode=0 Mar 10 00:12:54 crc kubenswrapper[4906]: I0310 00:12:54.530548 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zr8fw" 
event={"ID":"0c6467f0-f671-4388-9851-05416de6f4b1","Type":"ContainerDied","Data":"dfb7e1aec7158e8f42b503ba782130c981e354402a4b978659c4d24a852e85b2"} Mar 10 00:12:54 crc kubenswrapper[4906]: I0310 00:12:54.553596 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-qrjdl" podStartSLOduration=2.553575265 podStartE2EDuration="2.553575265s" podCreationTimestamp="2026-03-10 00:12:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:12:54.54915215 +0000 UTC m=+400.697047282" watchObservedRunningTime="2026-03-10 00:12:54.553575265 +0000 UTC m=+400.701470367" Mar 10 00:12:55 crc kubenswrapper[4906]: I0310 00:12:55.538847 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zr8fw" event={"ID":"0c6467f0-f671-4388-9851-05416de6f4b1","Type":"ContainerStarted","Data":"31ec763a634fa1600e139aeaca4e6c12254d1e0baa49241449367c1d51e1d2e8"} Mar 10 00:12:57 crc kubenswrapper[4906]: I0310 00:12:57.722538 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2zhb6" Mar 10 00:12:57 crc kubenswrapper[4906]: I0310 00:12:57.724362 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2zhb6" Mar 10 00:12:57 crc kubenswrapper[4906]: I0310 00:12:57.794025 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2zhb6" Mar 10 00:12:57 crc kubenswrapper[4906]: I0310 00:12:57.833315 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zr8fw" podStartSLOduration=5.350243779 podStartE2EDuration="7.833288314s" podCreationTimestamp="2026-03-10 00:12:50 +0000 UTC" firstStartedPulling="2026-03-10 00:12:52.490556061 +0000 UTC 
m=+398.638451173" lastFinishedPulling="2026-03-10 00:12:54.973600596 +0000 UTC m=+401.121495708" observedRunningTime="2026-03-10 00:12:55.559576875 +0000 UTC m=+401.707472017" watchObservedRunningTime="2026-03-10 00:12:57.833288314 +0000 UTC m=+403.981183436" Mar 10 00:12:58 crc kubenswrapper[4906]: I0310 00:12:58.392374 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9x7g5" Mar 10 00:12:58 crc kubenswrapper[4906]: I0310 00:12:58.393078 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9x7g5" Mar 10 00:12:58 crc kubenswrapper[4906]: I0310 00:12:58.461161 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9x7g5" Mar 10 00:12:58 crc kubenswrapper[4906]: I0310 00:12:58.622704 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2zhb6" Mar 10 00:12:58 crc kubenswrapper[4906]: I0310 00:12:58.636234 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9x7g5" Mar 10 00:13:00 crc kubenswrapper[4906]: I0310 00:13:00.132169 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4cfdg" Mar 10 00:13:00 crc kubenswrapper[4906]: I0310 00:13:00.132274 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4cfdg" Mar 10 00:13:00 crc kubenswrapper[4906]: I0310 00:13:00.209936 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4cfdg" Mar 10 00:13:00 crc kubenswrapper[4906]: I0310 00:13:00.502366 4906 patch_prober.go:28] interesting pod/machine-config-daemon-bxtw4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness 
probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:13:00 crc kubenswrapper[4906]: I0310 00:13:00.502473 4906 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:13:00 crc kubenswrapper[4906]: I0310 00:13:00.653030 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4cfdg" Mar 10 00:13:01 crc kubenswrapper[4906]: I0310 00:13:01.124575 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zr8fw" Mar 10 00:13:01 crc kubenswrapper[4906]: I0310 00:13:01.124958 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zr8fw" Mar 10 00:13:02 crc kubenswrapper[4906]: I0310 00:13:02.181675 4906 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zr8fw" podUID="0c6467f0-f671-4388-9851-05416de6f4b1" containerName="registry-server" probeResult="failure" output=< Mar 10 00:13:02 crc kubenswrapper[4906]: timeout: failed to connect service ":50051" within 1s Mar 10 00:13:02 crc kubenswrapper[4906]: > Mar 10 00:13:11 crc kubenswrapper[4906]: I0310 00:13:11.170598 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zr8fw" Mar 10 00:13:11 crc kubenswrapper[4906]: I0310 00:13:11.215328 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zr8fw" Mar 10 00:13:13 crc kubenswrapper[4906]: I0310 00:13:13.407475 4906 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-qrjdl" Mar 10 00:13:13 crc kubenswrapper[4906]: I0310 00:13:13.495672 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-66jxp"] Mar 10 00:13:30 crc kubenswrapper[4906]: I0310 00:13:30.502616 4906 patch_prober.go:28] interesting pod/machine-config-daemon-bxtw4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:13:30 crc kubenswrapper[4906]: I0310 00:13:30.503236 4906 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:13:38 crc kubenswrapper[4906]: I0310 00:13:38.536191 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" podUID="47ee6fa1-0ef0-414f-91af-0f170e94c390" containerName="registry" containerID="cri-o://10aee4e6e619472c94b77cc63290d24e555fb483641c7dcd4c04a02eb2112dd6" gracePeriod=30 Mar 10 00:13:38 crc kubenswrapper[4906]: I0310 00:13:38.916699 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:13:38 crc kubenswrapper[4906]: I0310 00:13:38.932318 4906 generic.go:334] "Generic (PLEG): container finished" podID="47ee6fa1-0ef0-414f-91af-0f170e94c390" containerID="10aee4e6e619472c94b77cc63290d24e555fb483641c7dcd4c04a02eb2112dd6" exitCode=0 Mar 10 00:13:38 crc kubenswrapper[4906]: I0310 00:13:38.932364 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" Mar 10 00:13:38 crc kubenswrapper[4906]: I0310 00:13:38.932400 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" event={"ID":"47ee6fa1-0ef0-414f-91af-0f170e94c390","Type":"ContainerDied","Data":"10aee4e6e619472c94b77cc63290d24e555fb483641c7dcd4c04a02eb2112dd6"} Mar 10 00:13:38 crc kubenswrapper[4906]: I0310 00:13:38.932471 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-66jxp" event={"ID":"47ee6fa1-0ef0-414f-91af-0f170e94c390","Type":"ContainerDied","Data":"f3e975c0539acc30f2d16ea72bcdba40ba957693aae0b95d8dab10c83f2fbdaa"} Mar 10 00:13:38 crc kubenswrapper[4906]: I0310 00:13:38.932505 4906 scope.go:117] "RemoveContainer" containerID="10aee4e6e619472c94b77cc63290d24e555fb483641c7dcd4c04a02eb2112dd6" Mar 10 00:13:38 crc kubenswrapper[4906]: I0310 00:13:38.952752 4906 scope.go:117] "RemoveContainer" containerID="10aee4e6e619472c94b77cc63290d24e555fb483641c7dcd4c04a02eb2112dd6" Mar 10 00:13:38 crc kubenswrapper[4906]: E0310 00:13:38.953443 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10aee4e6e619472c94b77cc63290d24e555fb483641c7dcd4c04a02eb2112dd6\": container with ID starting with 10aee4e6e619472c94b77cc63290d24e555fb483641c7dcd4c04a02eb2112dd6 not found: ID does not exist" containerID="10aee4e6e619472c94b77cc63290d24e555fb483641c7dcd4c04a02eb2112dd6" Mar 10 00:13:38 crc kubenswrapper[4906]: I0310 00:13:38.953499 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10aee4e6e619472c94b77cc63290d24e555fb483641c7dcd4c04a02eb2112dd6"} err="failed to get container status \"10aee4e6e619472c94b77cc63290d24e555fb483641c7dcd4c04a02eb2112dd6\": rpc error: code = NotFound desc = could not find container 
\"10aee4e6e619472c94b77cc63290d24e555fb483641c7dcd4c04a02eb2112dd6\": container with ID starting with 10aee4e6e619472c94b77cc63290d24e555fb483641c7dcd4c04a02eb2112dd6 not found: ID does not exist" Mar 10 00:13:38 crc kubenswrapper[4906]: I0310 00:13:38.979373 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/47ee6fa1-0ef0-414f-91af-0f170e94c390-bound-sa-token\") pod \"47ee6fa1-0ef0-414f-91af-0f170e94c390\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " Mar 10 00:13:38 crc kubenswrapper[4906]: I0310 00:13:38.979421 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/47ee6fa1-0ef0-414f-91af-0f170e94c390-registry-certificates\") pod \"47ee6fa1-0ef0-414f-91af-0f170e94c390\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " Mar 10 00:13:38 crc kubenswrapper[4906]: I0310 00:13:38.979445 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfkfp\" (UniqueName: \"kubernetes.io/projected/47ee6fa1-0ef0-414f-91af-0f170e94c390-kube-api-access-rfkfp\") pod \"47ee6fa1-0ef0-414f-91af-0f170e94c390\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " Mar 10 00:13:38 crc kubenswrapper[4906]: I0310 00:13:38.981577 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47ee6fa1-0ef0-414f-91af-0f170e94c390-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "47ee6fa1-0ef0-414f-91af-0f170e94c390" (UID: "47ee6fa1-0ef0-414f-91af-0f170e94c390"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:13:38 crc kubenswrapper[4906]: I0310 00:13:38.990050 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47ee6fa1-0ef0-414f-91af-0f170e94c390-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "47ee6fa1-0ef0-414f-91af-0f170e94c390" (UID: "47ee6fa1-0ef0-414f-91af-0f170e94c390"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:13:38 crc kubenswrapper[4906]: I0310 00:13:38.990124 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47ee6fa1-0ef0-414f-91af-0f170e94c390-kube-api-access-rfkfp" (OuterVolumeSpecName: "kube-api-access-rfkfp") pod "47ee6fa1-0ef0-414f-91af-0f170e94c390" (UID: "47ee6fa1-0ef0-414f-91af-0f170e94c390"). InnerVolumeSpecName "kube-api-access-rfkfp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:13:39 crc kubenswrapper[4906]: I0310 00:13:39.080890 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"47ee6fa1-0ef0-414f-91af-0f170e94c390\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " Mar 10 00:13:39 crc kubenswrapper[4906]: I0310 00:13:39.081020 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/47ee6fa1-0ef0-414f-91af-0f170e94c390-installation-pull-secrets\") pod \"47ee6fa1-0ef0-414f-91af-0f170e94c390\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " Mar 10 00:13:39 crc kubenswrapper[4906]: I0310 00:13:39.081070 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/47ee6fa1-0ef0-414f-91af-0f170e94c390-ca-trust-extracted\") pod 
\"47ee6fa1-0ef0-414f-91af-0f170e94c390\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " Mar 10 00:13:39 crc kubenswrapper[4906]: I0310 00:13:39.081114 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/47ee6fa1-0ef0-414f-91af-0f170e94c390-trusted-ca\") pod \"47ee6fa1-0ef0-414f-91af-0f170e94c390\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " Mar 10 00:13:39 crc kubenswrapper[4906]: I0310 00:13:39.081150 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/47ee6fa1-0ef0-414f-91af-0f170e94c390-registry-tls\") pod \"47ee6fa1-0ef0-414f-91af-0f170e94c390\" (UID: \"47ee6fa1-0ef0-414f-91af-0f170e94c390\") " Mar 10 00:13:39 crc kubenswrapper[4906]: I0310 00:13:39.081374 4906 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/47ee6fa1-0ef0-414f-91af-0f170e94c390-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:39 crc kubenswrapper[4906]: I0310 00:13:39.081392 4906 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/47ee6fa1-0ef0-414f-91af-0f170e94c390-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:39 crc kubenswrapper[4906]: I0310 00:13:39.081403 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfkfp\" (UniqueName: \"kubernetes.io/projected/47ee6fa1-0ef0-414f-91af-0f170e94c390-kube-api-access-rfkfp\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:39 crc kubenswrapper[4906]: I0310 00:13:39.081978 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47ee6fa1-0ef0-414f-91af-0f170e94c390-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "47ee6fa1-0ef0-414f-91af-0f170e94c390" (UID: "47ee6fa1-0ef0-414f-91af-0f170e94c390"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:13:39 crc kubenswrapper[4906]: I0310 00:13:39.087878 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47ee6fa1-0ef0-414f-91af-0f170e94c390-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "47ee6fa1-0ef0-414f-91af-0f170e94c390" (UID: "47ee6fa1-0ef0-414f-91af-0f170e94c390"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:13:39 crc kubenswrapper[4906]: I0310 00:13:39.087953 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47ee6fa1-0ef0-414f-91af-0f170e94c390-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "47ee6fa1-0ef0-414f-91af-0f170e94c390" (UID: "47ee6fa1-0ef0-414f-91af-0f170e94c390"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:13:39 crc kubenswrapper[4906]: I0310 00:13:39.093238 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "47ee6fa1-0ef0-414f-91af-0f170e94c390" (UID: "47ee6fa1-0ef0-414f-91af-0f170e94c390"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 10 00:13:39 crc kubenswrapper[4906]: I0310 00:13:39.101167 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47ee6fa1-0ef0-414f-91af-0f170e94c390-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "47ee6fa1-0ef0-414f-91af-0f170e94c390" (UID: "47ee6fa1-0ef0-414f-91af-0f170e94c390"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:13:39 crc kubenswrapper[4906]: I0310 00:13:39.182755 4906 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/47ee6fa1-0ef0-414f-91af-0f170e94c390-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:39 crc kubenswrapper[4906]: I0310 00:13:39.182786 4906 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/47ee6fa1-0ef0-414f-91af-0f170e94c390-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:39 crc kubenswrapper[4906]: I0310 00:13:39.182797 4906 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/47ee6fa1-0ef0-414f-91af-0f170e94c390-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:39 crc kubenswrapper[4906]: I0310 00:13:39.182805 4906 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/47ee6fa1-0ef0-414f-91af-0f170e94c390-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 10 00:13:39 crc kubenswrapper[4906]: I0310 00:13:39.285793 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-66jxp"] Mar 10 00:13:39 crc kubenswrapper[4906]: I0310 00:13:39.295019 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-66jxp"] Mar 10 00:13:39 crc kubenswrapper[4906]: E0310 00:13:39.369925 4906 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47ee6fa1_0ef0_414f_91af_0f170e94c390.slice\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47ee6fa1_0ef0_414f_91af_0f170e94c390.slice/crio-f3e975c0539acc30f2d16ea72bcdba40ba957693aae0b95d8dab10c83f2fbdaa\": RecentStats: unable to find data in memory cache]" Mar 10 00:13:40 crc kubenswrapper[4906]: I0310 00:13:40.591229 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47ee6fa1-0ef0-414f-91af-0f170e94c390" path="/var/lib/kubelet/pods/47ee6fa1-0ef0-414f-91af-0f170e94c390/volumes" Mar 10 00:14:00 crc kubenswrapper[4906]: I0310 00:14:00.150967 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551694-vvnsl"] Mar 10 00:14:00 crc kubenswrapper[4906]: E0310 00:14:00.153802 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47ee6fa1-0ef0-414f-91af-0f170e94c390" containerName="registry" Mar 10 00:14:00 crc kubenswrapper[4906]: I0310 00:14:00.153831 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="47ee6fa1-0ef0-414f-91af-0f170e94c390" containerName="registry" Mar 10 00:14:00 crc kubenswrapper[4906]: I0310 00:14:00.154052 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="47ee6fa1-0ef0-414f-91af-0f170e94c390" containerName="registry" Mar 10 00:14:00 crc kubenswrapper[4906]: I0310 00:14:00.154835 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551694-vvnsl" Mar 10 00:14:00 crc kubenswrapper[4906]: I0310 00:14:00.158564 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 00:14:00 crc kubenswrapper[4906]: I0310 00:14:00.158728 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 00:14:00 crc kubenswrapper[4906]: I0310 00:14:00.159004 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ktdr6" Mar 10 00:14:00 crc kubenswrapper[4906]: I0310 00:14:00.167246 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551694-vvnsl"] Mar 10 00:14:00 crc kubenswrapper[4906]: I0310 00:14:00.215300 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xcnb\" (UniqueName: \"kubernetes.io/projected/8f53421f-d498-4bfa-8043-678e1083105e-kube-api-access-5xcnb\") pod \"auto-csr-approver-29551694-vvnsl\" (UID: \"8f53421f-d498-4bfa-8043-678e1083105e\") " pod="openshift-infra/auto-csr-approver-29551694-vvnsl" Mar 10 00:14:00 crc kubenswrapper[4906]: I0310 00:14:00.316128 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xcnb\" (UniqueName: \"kubernetes.io/projected/8f53421f-d498-4bfa-8043-678e1083105e-kube-api-access-5xcnb\") pod \"auto-csr-approver-29551694-vvnsl\" (UID: \"8f53421f-d498-4bfa-8043-678e1083105e\") " pod="openshift-infra/auto-csr-approver-29551694-vvnsl" Mar 10 00:14:00 crc kubenswrapper[4906]: I0310 00:14:00.340435 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xcnb\" (UniqueName: \"kubernetes.io/projected/8f53421f-d498-4bfa-8043-678e1083105e-kube-api-access-5xcnb\") pod \"auto-csr-approver-29551694-vvnsl\" (UID: \"8f53421f-d498-4bfa-8043-678e1083105e\") " 
pod="openshift-infra/auto-csr-approver-29551694-vvnsl"
Mar 10 00:14:00 crc kubenswrapper[4906]: I0310 00:14:00.495545 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551694-vvnsl"
Mar 10 00:14:00 crc kubenswrapper[4906]: I0310 00:14:00.503142 4906 patch_prober.go:28] interesting pod/machine-config-daemon-bxtw4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 00:14:00 crc kubenswrapper[4906]: I0310 00:14:00.503238 4906 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 00:14:00 crc kubenswrapper[4906]: I0310 00:14:00.503307 4906 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4"
Mar 10 00:14:00 crc kubenswrapper[4906]: I0310 00:14:00.504583 4906 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8e93da60220c7a28ef01def4f9a5029323cddf710d54a9d14b770a8f7137b36e"} pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 10 00:14:00 crc kubenswrapper[4906]: I0310 00:14:00.504778 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" containerName="machine-config-daemon" containerID="cri-o://8e93da60220c7a28ef01def4f9a5029323cddf710d54a9d14b770a8f7137b36e" gracePeriod=600
Mar 10 00:14:01 crc kubenswrapper[4906]: I0310 00:14:00.763800 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551694-vvnsl"]
Mar 10 00:14:01 crc kubenswrapper[4906]: W0310 00:14:00.776514 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f53421f_d498_4bfa_8043_678e1083105e.slice/crio-07f9ee66407d48ab665744f7e144e256b37ca16b8fef60588715f4f35a36b150 WatchSource:0}: Error finding container 07f9ee66407d48ab665744f7e144e256b37ca16b8fef60588715f4f35a36b150: Status 404 returned error can't find the container with id 07f9ee66407d48ab665744f7e144e256b37ca16b8fef60588715f4f35a36b150
Mar 10 00:14:01 crc kubenswrapper[4906]: I0310 00:14:01.095652 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551694-vvnsl" event={"ID":"8f53421f-d498-4bfa-8043-678e1083105e","Type":"ContainerStarted","Data":"07f9ee66407d48ab665744f7e144e256b37ca16b8fef60588715f4f35a36b150"}
Mar 10 00:14:01 crc kubenswrapper[4906]: I0310 00:14:01.101185 4906 generic.go:334] "Generic (PLEG): container finished" podID="72d61d35-0a64-45a5-8df3-9c429727deba" containerID="8e93da60220c7a28ef01def4f9a5029323cddf710d54a9d14b770a8f7137b36e" exitCode=0
Mar 10 00:14:01 crc kubenswrapper[4906]: I0310 00:14:01.101229 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" event={"ID":"72d61d35-0a64-45a5-8df3-9c429727deba","Type":"ContainerDied","Data":"8e93da60220c7a28ef01def4f9a5029323cddf710d54a9d14b770a8f7137b36e"}
Mar 10 00:14:01 crc kubenswrapper[4906]: I0310 00:14:01.101265 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" event={"ID":"72d61d35-0a64-45a5-8df3-9c429727deba","Type":"ContainerStarted","Data":"ff7670643966cdfab75a0c844ba14d4d3b7f4816f0572260c65b9eddbbe62eaa"}
Mar 10 00:14:01 crc kubenswrapper[4906]: I0310 00:14:01.101286 4906 scope.go:117] "RemoveContainer" containerID="f53c021f85bbd1f77bb00169203a7ee8629e31f3fab47b40dc594d7995d4b82a"
Mar 10 00:14:02 crc kubenswrapper[4906]: I0310 00:14:02.112230 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551694-vvnsl" event={"ID":"8f53421f-d498-4bfa-8043-678e1083105e","Type":"ContainerStarted","Data":"81e618ed887c40423b6e3116ed9cea0c7ae9a0fad654dc75a1cb225744f296ad"}
Mar 10 00:14:03 crc kubenswrapper[4906]: I0310 00:14:03.126312 4906 generic.go:334] "Generic (PLEG): container finished" podID="8f53421f-d498-4bfa-8043-678e1083105e" containerID="81e618ed887c40423b6e3116ed9cea0c7ae9a0fad654dc75a1cb225744f296ad" exitCode=0
Mar 10 00:14:03 crc kubenswrapper[4906]: I0310 00:14:03.126430 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551694-vvnsl" event={"ID":"8f53421f-d498-4bfa-8043-678e1083105e","Type":"ContainerDied","Data":"81e618ed887c40423b6e3116ed9cea0c7ae9a0fad654dc75a1cb225744f296ad"}
Mar 10 00:14:04 crc kubenswrapper[4906]: I0310 00:14:04.472484 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551694-vvnsl"
Mar 10 00:14:04 crc kubenswrapper[4906]: I0310 00:14:04.608112 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xcnb\" (UniqueName: \"kubernetes.io/projected/8f53421f-d498-4bfa-8043-678e1083105e-kube-api-access-5xcnb\") pod \"8f53421f-d498-4bfa-8043-678e1083105e\" (UID: \"8f53421f-d498-4bfa-8043-678e1083105e\") "
Mar 10 00:14:04 crc kubenswrapper[4906]: I0310 00:14:04.617661 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f53421f-d498-4bfa-8043-678e1083105e-kube-api-access-5xcnb" (OuterVolumeSpecName: "kube-api-access-5xcnb") pod "8f53421f-d498-4bfa-8043-678e1083105e" (UID: "8f53421f-d498-4bfa-8043-678e1083105e"). InnerVolumeSpecName "kube-api-access-5xcnb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:14:04 crc kubenswrapper[4906]: I0310 00:14:04.711307 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xcnb\" (UniqueName: \"kubernetes.io/projected/8f53421f-d498-4bfa-8043-678e1083105e-kube-api-access-5xcnb\") on node \"crc\" DevicePath \"\""
Mar 10 00:14:05 crc kubenswrapper[4906]: I0310 00:14:05.147543 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551694-vvnsl" event={"ID":"8f53421f-d498-4bfa-8043-678e1083105e","Type":"ContainerDied","Data":"07f9ee66407d48ab665744f7e144e256b37ca16b8fef60588715f4f35a36b150"}
Mar 10 00:14:05 crc kubenswrapper[4906]: I0310 00:14:05.147724 4906 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07f9ee66407d48ab665744f7e144e256b37ca16b8fef60588715f4f35a36b150"
Mar 10 00:14:05 crc kubenswrapper[4906]: I0310 00:14:05.147731 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551694-vvnsl"
Mar 10 00:14:05 crc kubenswrapper[4906]: I0310 00:14:05.230396 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551688-fkkqj"]
Mar 10 00:14:05 crc kubenswrapper[4906]: I0310 00:14:05.238238 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551688-fkkqj"]
Mar 10 00:14:06 crc kubenswrapper[4906]: I0310 00:14:06.590912 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="094c6270-b610-42c0-a6ce-3c146cb6bb6c" path="/var/lib/kubelet/pods/094c6270-b610-42c0-a6ce-3c146cb6bb6c/volumes"
Mar 10 00:15:00 crc kubenswrapper[4906]: I0310 00:15:00.134206 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551695-klw7c"]
Mar 10 00:15:00 crc kubenswrapper[4906]: E0310 00:15:00.135404 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f53421f-d498-4bfa-8043-678e1083105e" containerName="oc"
Mar 10 00:15:00 crc kubenswrapper[4906]: I0310 00:15:00.135422 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f53421f-d498-4bfa-8043-678e1083105e" containerName="oc"
Mar 10 00:15:00 crc kubenswrapper[4906]: I0310 00:15:00.135552 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f53421f-d498-4bfa-8043-678e1083105e" containerName="oc"
Mar 10 00:15:00 crc kubenswrapper[4906]: I0310 00:15:00.136033 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551695-klw7c"
Mar 10 00:15:00 crc kubenswrapper[4906]: I0310 00:15:00.139034 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551695-klw7c"]
Mar 10 00:15:00 crc kubenswrapper[4906]: I0310 00:15:00.140015 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 10 00:15:00 crc kubenswrapper[4906]: I0310 00:15:00.140230 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 10 00:15:00 crc kubenswrapper[4906]: I0310 00:15:00.304750 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlndv\" (UniqueName: \"kubernetes.io/projected/1edea7ea-a95b-4b23-931b-2a9929ea36cf-kube-api-access-xlndv\") pod \"collect-profiles-29551695-klw7c\" (UID: \"1edea7ea-a95b-4b23-931b-2a9929ea36cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551695-klw7c"
Mar 10 00:15:00 crc kubenswrapper[4906]: I0310 00:15:00.304812 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1edea7ea-a95b-4b23-931b-2a9929ea36cf-config-volume\") pod \"collect-profiles-29551695-klw7c\" (UID: \"1edea7ea-a95b-4b23-931b-2a9929ea36cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551695-klw7c"
Mar 10 00:15:00 crc kubenswrapper[4906]: I0310 00:15:00.304863 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1edea7ea-a95b-4b23-931b-2a9929ea36cf-secret-volume\") pod \"collect-profiles-29551695-klw7c\" (UID: \"1edea7ea-a95b-4b23-931b-2a9929ea36cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551695-klw7c"
Mar 10 00:15:00 crc kubenswrapper[4906]: I0310 00:15:00.405990 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlndv\" (UniqueName: \"kubernetes.io/projected/1edea7ea-a95b-4b23-931b-2a9929ea36cf-kube-api-access-xlndv\") pod \"collect-profiles-29551695-klw7c\" (UID: \"1edea7ea-a95b-4b23-931b-2a9929ea36cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551695-klw7c"
Mar 10 00:15:00 crc kubenswrapper[4906]: I0310 00:15:00.406369 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1edea7ea-a95b-4b23-931b-2a9929ea36cf-config-volume\") pod \"collect-profiles-29551695-klw7c\" (UID: \"1edea7ea-a95b-4b23-931b-2a9929ea36cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551695-klw7c"
Mar 10 00:15:00 crc kubenswrapper[4906]: I0310 00:15:00.406408 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1edea7ea-a95b-4b23-931b-2a9929ea36cf-secret-volume\") pod \"collect-profiles-29551695-klw7c\" (UID: \"1edea7ea-a95b-4b23-931b-2a9929ea36cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551695-klw7c"
Mar 10 00:15:00 crc kubenswrapper[4906]: I0310 00:15:00.407796 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1edea7ea-a95b-4b23-931b-2a9929ea36cf-config-volume\") pod \"collect-profiles-29551695-klw7c\" (UID: \"1edea7ea-a95b-4b23-931b-2a9929ea36cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551695-klw7c"
Mar 10 00:15:00 crc kubenswrapper[4906]: I0310 00:15:00.412764 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1edea7ea-a95b-4b23-931b-2a9929ea36cf-secret-volume\") pod \"collect-profiles-29551695-klw7c\" (UID: \"1edea7ea-a95b-4b23-931b-2a9929ea36cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551695-klw7c"
Mar 10 00:15:00 crc kubenswrapper[4906]: I0310 00:15:00.422960 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlndv\" (UniqueName: \"kubernetes.io/projected/1edea7ea-a95b-4b23-931b-2a9929ea36cf-kube-api-access-xlndv\") pod \"collect-profiles-29551695-klw7c\" (UID: \"1edea7ea-a95b-4b23-931b-2a9929ea36cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551695-klw7c"
Mar 10 00:15:00 crc kubenswrapper[4906]: I0310 00:15:00.460609 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551695-klw7c"
Mar 10 00:15:00 crc kubenswrapper[4906]: I0310 00:15:00.851172 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551695-klw7c"]
Mar 10 00:15:00 crc kubenswrapper[4906]: W0310 00:15:00.862441 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1edea7ea_a95b_4b23_931b_2a9929ea36cf.slice/crio-faf4cd9fb0b9f20c2999aa00d720ffbe296a54946fcbb4bcc89481edc82e8b8c WatchSource:0}: Error finding container faf4cd9fb0b9f20c2999aa00d720ffbe296a54946fcbb4bcc89481edc82e8b8c: Status 404 returned error can't find the container with id faf4cd9fb0b9f20c2999aa00d720ffbe296a54946fcbb4bcc89481edc82e8b8c
Mar 10 00:15:01 crc kubenswrapper[4906]: I0310 00:15:01.642441 4906 generic.go:334] "Generic (PLEG): container finished" podID="1edea7ea-a95b-4b23-931b-2a9929ea36cf" containerID="bf726e8e8070b80270effae17ed6a8c2b4779e93ece3062484d06f7f29de9158" exitCode=0
Mar 10 00:15:01 crc kubenswrapper[4906]: I0310 00:15:01.642510 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551695-klw7c" event={"ID":"1edea7ea-a95b-4b23-931b-2a9929ea36cf","Type":"ContainerDied","Data":"bf726e8e8070b80270effae17ed6a8c2b4779e93ece3062484d06f7f29de9158"}
Mar 10 00:15:01 crc kubenswrapper[4906]: I0310 00:15:01.642821 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551695-klw7c" event={"ID":"1edea7ea-a95b-4b23-931b-2a9929ea36cf","Type":"ContainerStarted","Data":"faf4cd9fb0b9f20c2999aa00d720ffbe296a54946fcbb4bcc89481edc82e8b8c"}
Mar 10 00:15:02 crc kubenswrapper[4906]: I0310 00:15:02.874963 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551695-klw7c"
Mar 10 00:15:02 crc kubenswrapper[4906]: I0310 00:15:02.945365 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlndv\" (UniqueName: \"kubernetes.io/projected/1edea7ea-a95b-4b23-931b-2a9929ea36cf-kube-api-access-xlndv\") pod \"1edea7ea-a95b-4b23-931b-2a9929ea36cf\" (UID: \"1edea7ea-a95b-4b23-931b-2a9929ea36cf\") "
Mar 10 00:15:02 crc kubenswrapper[4906]: I0310 00:15:02.945731 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1edea7ea-a95b-4b23-931b-2a9929ea36cf-secret-volume\") pod \"1edea7ea-a95b-4b23-931b-2a9929ea36cf\" (UID: \"1edea7ea-a95b-4b23-931b-2a9929ea36cf\") "
Mar 10 00:15:02 crc kubenswrapper[4906]: I0310 00:15:02.945870 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1edea7ea-a95b-4b23-931b-2a9929ea36cf-config-volume\") pod \"1edea7ea-a95b-4b23-931b-2a9929ea36cf\" (UID: \"1edea7ea-a95b-4b23-931b-2a9929ea36cf\") "
Mar 10 00:15:02 crc kubenswrapper[4906]: I0310 00:15:02.947065 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1edea7ea-a95b-4b23-931b-2a9929ea36cf-config-volume" (OuterVolumeSpecName: "config-volume") pod "1edea7ea-a95b-4b23-931b-2a9929ea36cf" (UID: "1edea7ea-a95b-4b23-931b-2a9929ea36cf"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:15:02 crc kubenswrapper[4906]: I0310 00:15:02.951815 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1edea7ea-a95b-4b23-931b-2a9929ea36cf-kube-api-access-xlndv" (OuterVolumeSpecName: "kube-api-access-xlndv") pod "1edea7ea-a95b-4b23-931b-2a9929ea36cf" (UID: "1edea7ea-a95b-4b23-931b-2a9929ea36cf"). InnerVolumeSpecName "kube-api-access-xlndv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:15:02 crc kubenswrapper[4906]: I0310 00:15:02.952543 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1edea7ea-a95b-4b23-931b-2a9929ea36cf-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1edea7ea-a95b-4b23-931b-2a9929ea36cf" (UID: "1edea7ea-a95b-4b23-931b-2a9929ea36cf"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 00:15:03 crc kubenswrapper[4906]: I0310 00:15:03.047472 4906 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1edea7ea-a95b-4b23-931b-2a9929ea36cf-config-volume\") on node \"crc\" DevicePath \"\""
Mar 10 00:15:03 crc kubenswrapper[4906]: I0310 00:15:03.047513 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlndv\" (UniqueName: \"kubernetes.io/projected/1edea7ea-a95b-4b23-931b-2a9929ea36cf-kube-api-access-xlndv\") on node \"crc\" DevicePath \"\""
Mar 10 00:15:03 crc kubenswrapper[4906]: I0310 00:15:03.047528 4906 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1edea7ea-a95b-4b23-931b-2a9929ea36cf-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 10 00:15:03 crc kubenswrapper[4906]: I0310 00:15:03.655070 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551695-klw7c" event={"ID":"1edea7ea-a95b-4b23-931b-2a9929ea36cf","Type":"ContainerDied","Data":"faf4cd9fb0b9f20c2999aa00d720ffbe296a54946fcbb4bcc89481edc82e8b8c"}
Mar 10 00:15:03 crc kubenswrapper[4906]: I0310 00:15:03.655108 4906 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="faf4cd9fb0b9f20c2999aa00d720ffbe296a54946fcbb4bcc89481edc82e8b8c"
Mar 10 00:15:03 crc kubenswrapper[4906]: I0310 00:15:03.655202 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551695-klw7c"
Mar 10 00:16:00 crc kubenswrapper[4906]: I0310 00:16:00.154622 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551696-9x59s"]
Mar 10 00:16:00 crc kubenswrapper[4906]: E0310 00:16:00.156112 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1edea7ea-a95b-4b23-931b-2a9929ea36cf" containerName="collect-profiles"
Mar 10 00:16:00 crc kubenswrapper[4906]: I0310 00:16:00.156137 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="1edea7ea-a95b-4b23-931b-2a9929ea36cf" containerName="collect-profiles"
Mar 10 00:16:00 crc kubenswrapper[4906]: I0310 00:16:00.156326 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="1edea7ea-a95b-4b23-931b-2a9929ea36cf" containerName="collect-profiles"
Mar 10 00:16:00 crc kubenswrapper[4906]: I0310 00:16:00.157037 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551696-9x59s"
Mar 10 00:16:00 crc kubenswrapper[4906]: I0310 00:16:00.160436 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551696-9x59s"]
Mar 10 00:16:00 crc kubenswrapper[4906]: I0310 00:16:00.200186 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 00:16:00 crc kubenswrapper[4906]: I0310 00:16:00.200493 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 00:16:00 crc kubenswrapper[4906]: I0310 00:16:00.200663 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ktdr6"
Mar 10 00:16:00 crc kubenswrapper[4906]: I0310 00:16:00.224448 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8gpd\" (UniqueName: \"kubernetes.io/projected/20232528-5424-4988-b5c5-52011267a7e4-kube-api-access-z8gpd\") pod \"auto-csr-approver-29551696-9x59s\" (UID: \"20232528-5424-4988-b5c5-52011267a7e4\") " pod="openshift-infra/auto-csr-approver-29551696-9x59s"
Mar 10 00:16:00 crc kubenswrapper[4906]: I0310 00:16:00.325696 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8gpd\" (UniqueName: \"kubernetes.io/projected/20232528-5424-4988-b5c5-52011267a7e4-kube-api-access-z8gpd\") pod \"auto-csr-approver-29551696-9x59s\" (UID: \"20232528-5424-4988-b5c5-52011267a7e4\") " pod="openshift-infra/auto-csr-approver-29551696-9x59s"
Mar 10 00:16:00 crc kubenswrapper[4906]: I0310 00:16:00.365969 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8gpd\" (UniqueName: \"kubernetes.io/projected/20232528-5424-4988-b5c5-52011267a7e4-kube-api-access-z8gpd\") pod \"auto-csr-approver-29551696-9x59s\" (UID: \"20232528-5424-4988-b5c5-52011267a7e4\") " pod="openshift-infra/auto-csr-approver-29551696-9x59s"
Mar 10 00:16:00 crc kubenswrapper[4906]: I0310 00:16:00.502402 4906 patch_prober.go:28] interesting pod/machine-config-daemon-bxtw4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 00:16:00 crc kubenswrapper[4906]: I0310 00:16:00.502501 4906 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 00:16:00 crc kubenswrapper[4906]: I0310 00:16:00.514198 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551696-9x59s"
Mar 10 00:16:00 crc kubenswrapper[4906]: I0310 00:16:00.827491 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551696-9x59s"]
Mar 10 00:16:00 crc kubenswrapper[4906]: I0310 00:16:00.838747 4906 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 10 00:16:01 crc kubenswrapper[4906]: I0310 00:16:01.118176 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551696-9x59s" event={"ID":"20232528-5424-4988-b5c5-52011267a7e4","Type":"ContainerStarted","Data":"337eb8f7153aee9f74df13d93b2f442988799932f036285a885a06632e3bba1b"}
Mar 10 00:16:03 crc kubenswrapper[4906]: I0310 00:16:03.148037 4906 generic.go:334] "Generic (PLEG): container finished" podID="20232528-5424-4988-b5c5-52011267a7e4" containerID="8336e4e5a87d14df8bdf132e789b69e30b4934d32a495d0bc21caea4fb7d1a68" exitCode=0
Mar 10 00:16:03 crc kubenswrapper[4906]: I0310 00:16:03.148156 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551696-9x59s" event={"ID":"20232528-5424-4988-b5c5-52011267a7e4","Type":"ContainerDied","Data":"8336e4e5a87d14df8bdf132e789b69e30b4934d32a495d0bc21caea4fb7d1a68"}
Mar 10 00:16:04 crc kubenswrapper[4906]: I0310 00:16:04.520071 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551696-9x59s"
Mar 10 00:16:04 crc kubenswrapper[4906]: I0310 00:16:04.606783 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8gpd\" (UniqueName: \"kubernetes.io/projected/20232528-5424-4988-b5c5-52011267a7e4-kube-api-access-z8gpd\") pod \"20232528-5424-4988-b5c5-52011267a7e4\" (UID: \"20232528-5424-4988-b5c5-52011267a7e4\") "
Mar 10 00:16:04 crc kubenswrapper[4906]: I0310 00:16:04.617154 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20232528-5424-4988-b5c5-52011267a7e4-kube-api-access-z8gpd" (OuterVolumeSpecName: "kube-api-access-z8gpd") pod "20232528-5424-4988-b5c5-52011267a7e4" (UID: "20232528-5424-4988-b5c5-52011267a7e4"). InnerVolumeSpecName "kube-api-access-z8gpd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:16:04 crc kubenswrapper[4906]: I0310 00:16:04.708601 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8gpd\" (UniqueName: \"kubernetes.io/projected/20232528-5424-4988-b5c5-52011267a7e4-kube-api-access-z8gpd\") on node \"crc\" DevicePath \"\""
Mar 10 00:16:05 crc kubenswrapper[4906]: I0310 00:16:05.167677 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551696-9x59s" event={"ID":"20232528-5424-4988-b5c5-52011267a7e4","Type":"ContainerDied","Data":"337eb8f7153aee9f74df13d93b2f442988799932f036285a885a06632e3bba1b"}
Mar 10 00:16:05 crc kubenswrapper[4906]: I0310 00:16:05.167757 4906 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="337eb8f7153aee9f74df13d93b2f442988799932f036285a885a06632e3bba1b"
Mar 10 00:16:05 crc kubenswrapper[4906]: I0310 00:16:05.167771 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551696-9x59s"
Mar 10 00:16:05 crc kubenswrapper[4906]: I0310 00:16:05.613201 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551690-8jrvf"]
Mar 10 00:16:05 crc kubenswrapper[4906]: I0310 00:16:05.626878 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551690-8jrvf"]
Mar 10 00:16:06 crc kubenswrapper[4906]: I0310 00:16:06.594222 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d18b4f28-fc85-4aaf-af80-4272b80ef138" path="/var/lib/kubelet/pods/d18b4f28-fc85-4aaf-af80-4272b80ef138/volumes"
Mar 10 00:16:15 crc kubenswrapper[4906]: I0310 00:16:15.030540 4906 scope.go:117] "RemoveContainer" containerID="82c8cc3ef8c5b872fd0f4d97b18b2e2f95cbc6b451ee0407027a5c758f1f15d6"
Mar 10 00:16:15 crc kubenswrapper[4906]: I0310 00:16:15.119442 4906 scope.go:117] "RemoveContainer" containerID="ce97fe799a72e7dbdd8f5268ed7002df541d96660f4f6d940fc047b5ee3643d7"
Mar 10 00:16:30 crc kubenswrapper[4906]: I0310 00:16:30.502575 4906 patch_prober.go:28] interesting pod/machine-config-daemon-bxtw4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 00:16:30 crc kubenswrapper[4906]: I0310 00:16:30.505740 4906 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 00:17:00 crc kubenswrapper[4906]: I0310 00:17:00.502658 4906 patch_prober.go:28] interesting pod/machine-config-daemon-bxtw4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 00:17:00 crc kubenswrapper[4906]: I0310 00:17:00.503379 4906 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 00:17:00 crc kubenswrapper[4906]: I0310 00:17:00.503438 4906 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4"
Mar 10 00:17:00 crc kubenswrapper[4906]: I0310 00:17:00.504326 4906 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ff7670643966cdfab75a0c844ba14d4d3b7f4816f0572260c65b9eddbbe62eaa"} pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 10 00:17:00 crc kubenswrapper[4906]: I0310 00:17:00.504428 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" containerName="machine-config-daemon" containerID="cri-o://ff7670643966cdfab75a0c844ba14d4d3b7f4816f0572260c65b9eddbbe62eaa" gracePeriod=600
Mar 10 00:17:00 crc kubenswrapper[4906]: I0310 00:17:00.650371 4906 generic.go:334] "Generic (PLEG): container finished" podID="72d61d35-0a64-45a5-8df3-9c429727deba" containerID="ff7670643966cdfab75a0c844ba14d4d3b7f4816f0572260c65b9eddbbe62eaa" exitCode=0
Mar 10 00:17:00 crc kubenswrapper[4906]: I0310 00:17:00.650433 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" event={"ID":"72d61d35-0a64-45a5-8df3-9c429727deba","Type":"ContainerDied","Data":"ff7670643966cdfab75a0c844ba14d4d3b7f4816f0572260c65b9eddbbe62eaa"}
Mar 10 00:17:00 crc kubenswrapper[4906]: I0310 00:17:00.650479 4906 scope.go:117] "RemoveContainer" containerID="8e93da60220c7a28ef01def4f9a5029323cddf710d54a9d14b770a8f7137b36e"
Mar 10 00:17:01 crc kubenswrapper[4906]: I0310 00:17:01.661679 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" event={"ID":"72d61d35-0a64-45a5-8df3-9c429727deba","Type":"ContainerStarted","Data":"ebdf53f2bff9bcf61abafe2c602bdab6ed5145f512fe38143bc7c112b9a35137"}
Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.233259 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-hskrb"]
Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.236427 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" podUID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" containerName="ovn-controller" containerID="cri-o://6f75ed8b74bcf143893e0244f5d561e39792f3c066b4ea296e5e75dd0999a43a" gracePeriod=30
Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.236591 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" podUID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" containerName="kube-rbac-proxy-node" containerID="cri-o://911516785da6f740d1b7e21cd68006ed12e6e826d87405543b150a5b6331deb6" gracePeriod=30
Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.236619 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" podUID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://fee1a9d86245fa50b0fdbc1fd897326de61c74df8a05667d2dfca5a397a105c3" gracePeriod=30
Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.236907 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" podUID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" containerName="sbdb" containerID="cri-o://410b0c40f63f43dcd6fb50a359080c2c076f62858b948841cdbaf8c354d198f7" gracePeriod=30
Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.236930 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" podUID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" containerName="northd" containerID="cri-o://f0d58d338e79d263fe3e7d3ca54f65b5ff80376c0c094e3f51c66c00ad8ae878" gracePeriod=30
Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.237021 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" podUID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" containerName="ovn-acl-logging" containerID="cri-o://58958cb3a6a43945e2e85351e464979217571ace2e8aff3602776ecb003df0be" gracePeriod=30
Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.236543 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" podUID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" containerName="nbdb" containerID="cri-o://c09a8ef5244efb12dbf1782379b9633e2c35e936c2d1df7d715481c9b3bf0b21" gracePeriod=30
Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.308839 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" podUID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" containerName="ovnkube-controller" containerID="cri-o://5e998996c1ab592a23043a363deaef69007f550b1ab7d626bc644bc82db78bda" gracePeriod=30
Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.659077 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hskrb_c9f87520-6105-4b6f-ba5a-a232b5dc24c0/ovnkube-controller/2.log"
Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.662842 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hskrb_c9f87520-6105-4b6f-ba5a-a232b5dc24c0/ovn-acl-logging/0.log"
Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.663940 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hskrb_c9f87520-6105-4b6f-ba5a-a232b5dc24c0/ovn-controller/0.log"
Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.664800 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb"
Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.744070 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-t88kj"]
Mar 10 00:17:46 crc kubenswrapper[4906]: E0310 00:17:46.744911 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" containerName="ovn-controller"
Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.745011 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" containerName="ovn-controller"
Mar 10 00:17:46 crc kubenswrapper[4906]: E0310 00:17:46.745488 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" containerName="nbdb"
Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.745610 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" containerName="nbdb"
Mar 10 00:17:46 crc kubenswrapper[4906]: E0310 00:17:46.745711 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" containerName="ovnkube-controller"
Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.745838 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" containerName="ovnkube-controller"
Mar 10 00:17:46 crc kubenswrapper[4906]: E0310 00:17:46.745919 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" containerName="ovnkube-controller"
Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.745993 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" containerName="ovnkube-controller"
Mar 10 00:17:46 crc kubenswrapper[4906]: E0310 00:17:46.746175 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" containerName="kube-rbac-proxy-node"
Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.746249 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" containerName="kube-rbac-proxy-node"
Mar 10 00:17:46 crc kubenswrapper[4906]: E0310 00:17:46.746337 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20232528-5424-4988-b5c5-52011267a7e4" containerName="oc"
Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.746409 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="20232528-5424-4988-b5c5-52011267a7e4" containerName="oc"
Mar 10 00:17:46 crc kubenswrapper[4906]: E0310 00:17:46.746482 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" containerName="sbdb"
Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.746554 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" containerName="sbdb"
Mar 10 00:17:46 crc kubenswrapper[4906]: E0310 00:17:46.746632 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" containerName="kubecfg-setup"
Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.746774 4906 state_mem.go:107]
"Deleted CPUSet assignment" podUID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" containerName="kubecfg-setup" Mar 10 00:17:46 crc kubenswrapper[4906]: E0310 00:17:46.746877 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" containerName="ovn-acl-logging" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.746957 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" containerName="ovn-acl-logging" Mar 10 00:17:46 crc kubenswrapper[4906]: E0310 00:17:46.747029 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" containerName="ovnkube-controller" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.747097 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" containerName="ovnkube-controller" Mar 10 00:17:46 crc kubenswrapper[4906]: E0310 00:17:46.747167 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" containerName="kube-rbac-proxy-ovn-metrics" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.747245 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" containerName="kube-rbac-proxy-ovn-metrics" Mar 10 00:17:46 crc kubenswrapper[4906]: E0310 00:17:46.747332 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" containerName="northd" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.747402 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" containerName="northd" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.747589 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" containerName="ovnkube-controller" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.747695 4906 
memory_manager.go:354] "RemoveStaleState removing state" podUID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" containerName="ovnkube-controller" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.747775 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" containerName="northd" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.747852 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" containerName="ovnkube-controller" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.747928 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" containerName="ovn-controller" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.747999 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" containerName="nbdb" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.748326 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" containerName="kube-rbac-proxy-node" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.748409 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" containerName="kube-rbac-proxy-ovn-metrics" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.748481 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" containerName="sbdb" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.748558 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="20232528-5424-4988-b5c5-52011267a7e4" containerName="oc" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.748632 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" containerName="ovn-acl-logging" Mar 10 00:17:46 crc kubenswrapper[4906]: E0310 
00:17:46.748848 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" containerName="ovnkube-controller" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.748966 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" containerName="ovnkube-controller" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.749158 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" containerName="ovnkube-controller" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.751586 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.855454 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-host-kubelet\") pod \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.855543 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-ovnkube-script-lib\") pod \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.855590 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-host-run-ovn-kubernetes\") pod \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.855624 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-var-lib-openvswitch\") pod \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.855726 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-host-cni-netd\") pod \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.855752 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-host-run-netns\") pod \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.855785 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-ovnkube-config\") pod \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.855807 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-etc-openvswitch\") pod \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.855850 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbtkp\" (UniqueName: \"kubernetes.io/projected/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-kube-api-access-bbtkp\") pod \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") 
" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.855911 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "c9f87520-6105-4b6f-ba5a-a232b5dc24c0" (UID: "c9f87520-6105-4b6f-ba5a-a232b5dc24c0"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.855985 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "c9f87520-6105-4b6f-ba5a-a232b5dc24c0" (UID: "c9f87520-6105-4b6f-ba5a-a232b5dc24c0"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.855956 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "c9f87520-6105-4b6f-ba5a-a232b5dc24c0" (UID: "c9f87520-6105-4b6f-ba5a-a232b5dc24c0"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.855995 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "c9f87520-6105-4b6f-ba5a-a232b5dc24c0" (UID: "c9f87520-6105-4b6f-ba5a-a232b5dc24c0"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.855942 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "c9f87520-6105-4b6f-ba5a-a232b5dc24c0" (UID: "c9f87520-6105-4b6f-ba5a-a232b5dc24c0"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.856169 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-node-log" (OuterVolumeSpecName: "node-log") pod "c9f87520-6105-4b6f-ba5a-a232b5dc24c0" (UID: "c9f87520-6105-4b6f-ba5a-a232b5dc24c0"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.856174 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "c9f87520-6105-4b6f-ba5a-a232b5dc24c0" (UID: "c9f87520-6105-4b6f-ba5a-a232b5dc24c0"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.856713 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-node-log\") pod \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.856797 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-env-overrides\") pod \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.856841 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "c9f87520-6105-4b6f-ba5a-a232b5dc24c0" (UID: "c9f87520-6105-4b6f-ba5a-a232b5dc24c0"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.856821 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "c9f87520-6105-4b6f-ba5a-a232b5dc24c0" (UID: "c9f87520-6105-4b6f-ba5a-a232b5dc24c0"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.856847 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-run-systemd\") pod \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.856955 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-log-socket\") pod \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.857103 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-log-socket" (OuterVolumeSpecName: "log-socket") pod "c9f87520-6105-4b6f-ba5a-a232b5dc24c0" (UID: "c9f87520-6105-4b6f-ba5a-a232b5dc24c0"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.857029 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-ovn-node-metrics-cert\") pod \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.857256 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-run-ovn\") pod \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.857194 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "c9f87520-6105-4b6f-ba5a-a232b5dc24c0" (UID: "c9f87520-6105-4b6f-ba5a-a232b5dc24c0"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.857295 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-host-cni-bin\") pod \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.857330 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "c9f87520-6105-4b6f-ba5a-a232b5dc24c0" (UID: "c9f87520-6105-4b6f-ba5a-a232b5dc24c0"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.857349 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-host-slash\") pod \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.857381 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.857382 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "c9f87520-6105-4b6f-ba5a-a232b5dc24c0" (UID: "c9f87520-6105-4b6f-ba5a-a232b5dc24c0"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.857415 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-host-slash" (OuterVolumeSpecName: "host-slash") pod "c9f87520-6105-4b6f-ba5a-a232b5dc24c0" (UID: "c9f87520-6105-4b6f-ba5a-a232b5dc24c0"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.857438 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "c9f87520-6105-4b6f-ba5a-a232b5dc24c0" (UID: "c9f87520-6105-4b6f-ba5a-a232b5dc24c0"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.857409 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-run-openvswitch\") pod \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.857424 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "c9f87520-6105-4b6f-ba5a-a232b5dc24c0" (UID: "c9f87520-6105-4b6f-ba5a-a232b5dc24c0"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.857554 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-systemd-units\") pod \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\" (UID: \"c9f87520-6105-4b6f-ba5a-a232b5dc24c0\") " Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.857651 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "c9f87520-6105-4b6f-ba5a-a232b5dc24c0" (UID: "c9f87520-6105-4b6f-ba5a-a232b5dc24c0"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.858037 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d125115c-01cb-4cfc-9729-93f80c75109b-host-run-ovn-kubernetes\") pod \"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.858087 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d125115c-01cb-4cfc-9729-93f80c75109b-ovnkube-config\") pod \"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.858347 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d125115c-01cb-4cfc-9729-93f80c75109b-host-var-lib-cni-networks-ovn-kubernetes\") 
pod \"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.858432 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d125115c-01cb-4cfc-9729-93f80c75109b-node-log\") pod \"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.858468 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d125115c-01cb-4cfc-9729-93f80c75109b-log-socket\") pod \"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.858497 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d125115c-01cb-4cfc-9729-93f80c75109b-systemd-units\") pod \"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.858654 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d125115c-01cb-4cfc-9729-93f80c75109b-ovnkube-script-lib\") pod \"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.858692 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d125115c-01cb-4cfc-9729-93f80c75109b-host-slash\") pod 
\"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.858718 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d125115c-01cb-4cfc-9729-93f80c75109b-host-cni-bin\") pod \"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.858947 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d125115c-01cb-4cfc-9729-93f80c75109b-etc-openvswitch\") pod \"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.859057 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d125115c-01cb-4cfc-9729-93f80c75109b-host-kubelet\") pod \"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.859212 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d125115c-01cb-4cfc-9729-93f80c75109b-var-lib-openvswitch\") pod \"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.859301 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/d125115c-01cb-4cfc-9729-93f80c75109b-run-systemd\") pod \"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.859385 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d125115c-01cb-4cfc-9729-93f80c75109b-host-run-netns\") pod \"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.859426 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d125115c-01cb-4cfc-9729-93f80c75109b-ovn-node-metrics-cert\") pod \"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.859505 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvd68\" (UniqueName: \"kubernetes.io/projected/d125115c-01cb-4cfc-9729-93f80c75109b-kube-api-access-wvd68\") pod \"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.859555 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d125115c-01cb-4cfc-9729-93f80c75109b-run-openvswitch\") pod \"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.859678 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d125115c-01cb-4cfc-9729-93f80c75109b-env-overrides\") pod \"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.859816 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d125115c-01cb-4cfc-9729-93f80c75109b-run-ovn\") pod \"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.859879 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d125115c-01cb-4cfc-9729-93f80c75109b-host-cni-netd\") pod \"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.860058 4906 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.860089 4906 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.860114 4906 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-host-slash\") on node \"crc\" DevicePath \"\"" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.860134 4906 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.860154 4906 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.860173 4906 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.860191 4906 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.860209 4906 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.860231 4906 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.860253 4906 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.860271 4906 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.860288 4906 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.860307 4906 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.860324 4906 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.860345 4906 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-node-log\") on node \"crc\" DevicePath \"\"" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.860363 4906 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.860383 4906 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-log-socket\") on node \"crc\" DevicePath \"\"" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.864180 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod 
"c9f87520-6105-4b6f-ba5a-a232b5dc24c0" (UID: "c9f87520-6105-4b6f-ba5a-a232b5dc24c0"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.864785 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-kube-api-access-bbtkp" (OuterVolumeSpecName: "kube-api-access-bbtkp") pod "c9f87520-6105-4b6f-ba5a-a232b5dc24c0" (UID: "c9f87520-6105-4b6f-ba5a-a232b5dc24c0"). InnerVolumeSpecName "kube-api-access-bbtkp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.880442 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "c9f87520-6105-4b6f-ba5a-a232b5dc24c0" (UID: "c9f87520-6105-4b6f-ba5a-a232b5dc24c0"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.961967 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvd68\" (UniqueName: \"kubernetes.io/projected/d125115c-01cb-4cfc-9729-93f80c75109b-kube-api-access-wvd68\") pod \"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.962611 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d125115c-01cb-4cfc-9729-93f80c75109b-run-openvswitch\") pod \"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.962716 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d125115c-01cb-4cfc-9729-93f80c75109b-env-overrides\") pod \"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.962782 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d125115c-01cb-4cfc-9729-93f80c75109b-run-ovn\") pod \"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.962837 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d125115c-01cb-4cfc-9729-93f80c75109b-run-ovn\") pod \"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: 
I0310 00:17:46.962849 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d125115c-01cb-4cfc-9729-93f80c75109b-host-cni-netd\") pod \"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.962776 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d125115c-01cb-4cfc-9729-93f80c75109b-run-openvswitch\") pod \"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.962941 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d125115c-01cb-4cfc-9729-93f80c75109b-host-cni-netd\") pod \"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.963126 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d125115c-01cb-4cfc-9729-93f80c75109b-host-run-ovn-kubernetes\") pod \"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.963206 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d125115c-01cb-4cfc-9729-93f80c75109b-ovnkube-config\") pod \"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.963226 4906 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d125115c-01cb-4cfc-9729-93f80c75109b-host-run-ovn-kubernetes\") pod \"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.963257 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d125115c-01cb-4cfc-9729-93f80c75109b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.963339 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d125115c-01cb-4cfc-9729-93f80c75109b-node-log\") pod \"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.963386 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d125115c-01cb-4cfc-9729-93f80c75109b-systemd-units\") pod \"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.963429 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d125115c-01cb-4cfc-9729-93f80c75109b-log-socket\") pod \"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.963465 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d125115c-01cb-4cfc-9729-93f80c75109b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.963519 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d125115c-01cb-4cfc-9729-93f80c75109b-ovnkube-script-lib\") pod \"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.963545 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d125115c-01cb-4cfc-9729-93f80c75109b-systemd-units\") pod \"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.963565 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d125115c-01cb-4cfc-9729-93f80c75109b-host-cni-bin\") pod \"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.963572 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d125115c-01cb-4cfc-9729-93f80c75109b-log-socket\") pod \"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.963604 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/d125115c-01cb-4cfc-9729-93f80c75109b-host-slash\") pod \"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.963516 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d125115c-01cb-4cfc-9729-93f80c75109b-node-log\") pod \"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.963720 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d125115c-01cb-4cfc-9729-93f80c75109b-etc-openvswitch\") pod \"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.963772 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d125115c-01cb-4cfc-9729-93f80c75109b-host-slash\") pod \"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.963790 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d125115c-01cb-4cfc-9729-93f80c75109b-host-kubelet\") pod \"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.963824 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d125115c-01cb-4cfc-9729-93f80c75109b-etc-openvswitch\") pod \"ovnkube-node-t88kj\" (UID: 
\"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.963778 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d125115c-01cb-4cfc-9729-93f80c75109b-host-cni-bin\") pod \"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.963877 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d125115c-01cb-4cfc-9729-93f80c75109b-var-lib-openvswitch\") pod \"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.963920 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d125115c-01cb-4cfc-9729-93f80c75109b-var-lib-openvswitch\") pod \"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.963894 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d125115c-01cb-4cfc-9729-93f80c75109b-host-kubelet\") pod \"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.963841 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d125115c-01cb-4cfc-9729-93f80c75109b-env-overrides\") pod \"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 
00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.963944 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d125115c-01cb-4cfc-9729-93f80c75109b-run-systemd\") pod \"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.963980 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d125115c-01cb-4cfc-9729-93f80c75109b-run-systemd\") pod \"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.964016 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d125115c-01cb-4cfc-9729-93f80c75109b-host-run-netns\") pod \"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.964065 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d125115c-01cb-4cfc-9729-93f80c75109b-ovn-node-metrics-cert\") pod \"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.964145 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d125115c-01cb-4cfc-9729-93f80c75109b-host-run-netns\") pod \"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.964174 4906 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-bbtkp\" (UniqueName: \"kubernetes.io/projected/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-kube-api-access-bbtkp\") on node \"crc\" DevicePath \"\"" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.964208 4906 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.964242 4906 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c9f87520-6105-4b6f-ba5a-a232b5dc24c0-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.964416 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d125115c-01cb-4cfc-9729-93f80c75109b-ovnkube-config\") pod \"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.964734 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d125115c-01cb-4cfc-9729-93f80c75109b-ovnkube-script-lib\") pod \"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.969689 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d125115c-01cb-4cfc-9729-93f80c75109b-ovn-node-metrics-cert\") pod \"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:46 crc kubenswrapper[4906]: I0310 00:17:46.992084 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-wvd68\" (UniqueName: \"kubernetes.io/projected/d125115c-01cb-4cfc-9729-93f80c75109b-kube-api-access-wvd68\") pod \"ovnkube-node-t88kj\" (UID: \"d125115c-01cb-4cfc-9729-93f80c75109b\") " pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.024230 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hskrb_c9f87520-6105-4b6f-ba5a-a232b5dc24c0/ovnkube-controller/2.log" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.028028 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hskrb_c9f87520-6105-4b6f-ba5a-a232b5dc24c0/ovn-acl-logging/0.log" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.029932 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hskrb_c9f87520-6105-4b6f-ba5a-a232b5dc24c0/ovn-controller/0.log" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.030882 4906 generic.go:334] "Generic (PLEG): container finished" podID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" containerID="5e998996c1ab592a23043a363deaef69007f550b1ab7d626bc644bc82db78bda" exitCode=0 Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.030943 4906 generic.go:334] "Generic (PLEG): container finished" podID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" containerID="410b0c40f63f43dcd6fb50a359080c2c076f62858b948841cdbaf8c354d198f7" exitCode=0 Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.030970 4906 generic.go:334] "Generic (PLEG): container finished" podID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" containerID="c09a8ef5244efb12dbf1782379b9633e2c35e936c2d1df7d715481c9b3bf0b21" exitCode=0 Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.030995 4906 generic.go:334] "Generic (PLEG): container finished" podID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" containerID="f0d58d338e79d263fe3e7d3ca54f65b5ff80376c0c094e3f51c66c00ad8ae878" exitCode=0 Mar 10 00:17:47 crc 
kubenswrapper[4906]: I0310 00:17:47.031014 4906 generic.go:334] "Generic (PLEG): container finished" podID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" containerID="fee1a9d86245fa50b0fdbc1fd897326de61c74df8a05667d2dfca5a397a105c3" exitCode=0 Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.031032 4906 generic.go:334] "Generic (PLEG): container finished" podID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" containerID="911516785da6f740d1b7e21cd68006ed12e6e826d87405543b150a5b6331deb6" exitCode=0 Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.031053 4906 generic.go:334] "Generic (PLEG): container finished" podID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" containerID="58958cb3a6a43945e2e85351e464979217571ace2e8aff3602776ecb003df0be" exitCode=143 Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.031072 4906 generic.go:334] "Generic (PLEG): container finished" podID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" containerID="6f75ed8b74bcf143893e0244f5d561e39792f3c066b4ea296e5e75dd0999a43a" exitCode=143 Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.031067 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" event={"ID":"c9f87520-6105-4b6f-ba5a-a232b5dc24c0","Type":"ContainerDied","Data":"5e998996c1ab592a23043a363deaef69007f550b1ab7d626bc644bc82db78bda"} Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.031167 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" event={"ID":"c9f87520-6105-4b6f-ba5a-a232b5dc24c0","Type":"ContainerDied","Data":"410b0c40f63f43dcd6fb50a359080c2c076f62858b948841cdbaf8c354d198f7"} Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.031209 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" event={"ID":"c9f87520-6105-4b6f-ba5a-a232b5dc24c0","Type":"ContainerDied","Data":"c09a8ef5244efb12dbf1782379b9633e2c35e936c2d1df7d715481c9b3bf0b21"} Mar 10 00:17:47 crc kubenswrapper[4906]: 
I0310 00:17:47.031242 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" event={"ID":"c9f87520-6105-4b6f-ba5a-a232b5dc24c0","Type":"ContainerDied","Data":"f0d58d338e79d263fe3e7d3ca54f65b5ff80376c0c094e3f51c66c00ad8ae878"} Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.031274 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" event={"ID":"c9f87520-6105-4b6f-ba5a-a232b5dc24c0","Type":"ContainerDied","Data":"fee1a9d86245fa50b0fdbc1fd897326de61c74df8a05667d2dfca5a397a105c3"} Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.031303 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" event={"ID":"c9f87520-6105-4b6f-ba5a-a232b5dc24c0","Type":"ContainerDied","Data":"911516785da6f740d1b7e21cd68006ed12e6e826d87405543b150a5b6331deb6"} Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.031317 4906 scope.go:117] "RemoveContainer" containerID="5e998996c1ab592a23043a363deaef69007f550b1ab7d626bc644bc82db78bda" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.031331 4906 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6fe06556da9fc87aaebca8e8b6c030f6568bc352414267339059a3f9a1faff25"} Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.031465 4906 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"410b0c40f63f43dcd6fb50a359080c2c076f62858b948841cdbaf8c354d198f7"} Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.031484 4906 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c09a8ef5244efb12dbf1782379b9633e2c35e936c2d1df7d715481c9b3bf0b21"} Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.031500 4906 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"f0d58d338e79d263fe3e7d3ca54f65b5ff80376c0c094e3f51c66c00ad8ae878"} Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.031518 4906 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fee1a9d86245fa50b0fdbc1fd897326de61c74df8a05667d2dfca5a397a105c3"} Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.031533 4906 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"911516785da6f740d1b7e21cd68006ed12e6e826d87405543b150a5b6331deb6"} Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.031547 4906 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"58958cb3a6a43945e2e85351e464979217571ace2e8aff3602776ecb003df0be"} Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.031562 4906 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6f75ed8b74bcf143893e0244f5d561e39792f3c066b4ea296e5e75dd0999a43a"} Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.031578 4906 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80"} Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.031601 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" event={"ID":"c9f87520-6105-4b6f-ba5a-a232b5dc24c0","Type":"ContainerDied","Data":"58958cb3a6a43945e2e85351e464979217571ace2e8aff3602776ecb003df0be"} Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.031628 4906 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5e998996c1ab592a23043a363deaef69007f550b1ab7d626bc644bc82db78bda"} Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.031679 4906 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6fe06556da9fc87aaebca8e8b6c030f6568bc352414267339059a3f9a1faff25"} Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.031695 4906 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"410b0c40f63f43dcd6fb50a359080c2c076f62858b948841cdbaf8c354d198f7"} Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.031712 4906 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c09a8ef5244efb12dbf1782379b9633e2c35e936c2d1df7d715481c9b3bf0b21"} Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.031727 4906 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f0d58d338e79d263fe3e7d3ca54f65b5ff80376c0c094e3f51c66c00ad8ae878"} Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.031742 4906 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fee1a9d86245fa50b0fdbc1fd897326de61c74df8a05667d2dfca5a397a105c3"} Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.031756 4906 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"911516785da6f740d1b7e21cd68006ed12e6e826d87405543b150a5b6331deb6"} Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.031770 4906 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"58958cb3a6a43945e2e85351e464979217571ace2e8aff3602776ecb003df0be"} Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.031788 4906 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6f75ed8b74bcf143893e0244f5d561e39792f3c066b4ea296e5e75dd0999a43a"} Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.031805 4906 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80"} Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.031827 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" event={"ID":"c9f87520-6105-4b6f-ba5a-a232b5dc24c0","Type":"ContainerDied","Data":"6f75ed8b74bcf143893e0244f5d561e39792f3c066b4ea296e5e75dd0999a43a"} Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.031851 4906 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5e998996c1ab592a23043a363deaef69007f550b1ab7d626bc644bc82db78bda"} Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.031869 4906 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6fe06556da9fc87aaebca8e8b6c030f6568bc352414267339059a3f9a1faff25"} Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.031886 4906 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"410b0c40f63f43dcd6fb50a359080c2c076f62858b948841cdbaf8c354d198f7"} Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.031924 4906 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c09a8ef5244efb12dbf1782379b9633e2c35e936c2d1df7d715481c9b3bf0b21"} Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.031942 4906 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f0d58d338e79d263fe3e7d3ca54f65b5ff80376c0c094e3f51c66c00ad8ae878"} Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.031960 4906 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fee1a9d86245fa50b0fdbc1fd897326de61c74df8a05667d2dfca5a397a105c3"} Mar 10 
00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.031976 4906 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"911516785da6f740d1b7e21cd68006ed12e6e826d87405543b150a5b6331deb6"} Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.031991 4906 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"58958cb3a6a43945e2e85351e464979217571ace2e8aff3602776ecb003df0be"} Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.032006 4906 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6f75ed8b74bcf143893e0244f5d561e39792f3c066b4ea296e5e75dd0999a43a"} Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.032022 4906 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80"} Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.032043 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" event={"ID":"c9f87520-6105-4b6f-ba5a-a232b5dc24c0","Type":"ContainerDied","Data":"558cd2075ee300f87f012fb60f7aafe0af05a9b68f56a72922916b36cd40d060"} Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.032069 4906 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5e998996c1ab592a23043a363deaef69007f550b1ab7d626bc644bc82db78bda"} Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.032087 4906 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6fe06556da9fc87aaebca8e8b6c030f6568bc352414267339059a3f9a1faff25"} Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.032101 4906 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"410b0c40f63f43dcd6fb50a359080c2c076f62858b948841cdbaf8c354d198f7"} Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.032116 4906 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c09a8ef5244efb12dbf1782379b9633e2c35e936c2d1df7d715481c9b3bf0b21"} Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.032132 4906 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f0d58d338e79d263fe3e7d3ca54f65b5ff80376c0c094e3f51c66c00ad8ae878"} Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.032146 4906 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fee1a9d86245fa50b0fdbc1fd897326de61c74df8a05667d2dfca5a397a105c3"} Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.032162 4906 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"911516785da6f740d1b7e21cd68006ed12e6e826d87405543b150a5b6331deb6"} Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.032179 4906 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"58958cb3a6a43945e2e85351e464979217571ace2e8aff3602776ecb003df0be"} Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.032193 4906 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6f75ed8b74bcf143893e0244f5d561e39792f3c066b4ea296e5e75dd0999a43a"} Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.032208 4906 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80"} Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.032891 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hskrb" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.035433 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-85dv2_0c494c18-0d46-4e23-8ef5-214938a66a7b/kube-multus/1.log" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.038223 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-85dv2_0c494c18-0d46-4e23-8ef5-214938a66a7b/kube-multus/0.log" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.038291 4906 generic.go:334] "Generic (PLEG): container finished" podID="0c494c18-0d46-4e23-8ef5-214938a66a7b" containerID="97deba93385ce59ea5f63333b0faca22d02265e44d69bbfbb0df409c4f16bef1" exitCode=2 Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.038345 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-85dv2" event={"ID":"0c494c18-0d46-4e23-8ef5-214938a66a7b","Type":"ContainerDied","Data":"97deba93385ce59ea5f63333b0faca22d02265e44d69bbfbb0df409c4f16bef1"} Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.038376 4906 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7c5df9fb921fc76518f8da175e0a9b4e05f248b292a22ee6bcb600e07d94077a"} Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.040202 4906 scope.go:117] "RemoveContainer" containerID="97deba93385ce59ea5f63333b0faca22d02265e44d69bbfbb0df409c4f16bef1" Mar 10 00:17:47 crc kubenswrapper[4906]: E0310 00:17:47.040607 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-85dv2_openshift-multus(0c494c18-0d46-4e23-8ef5-214938a66a7b)\"" pod="openshift-multus/multus-85dv2" podUID="0c494c18-0d46-4e23-8ef5-214938a66a7b" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.065496 4906 scope.go:117] "RemoveContainer" 
containerID="6fe06556da9fc87aaebca8e8b6c030f6568bc352414267339059a3f9a1faff25" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.073295 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.098820 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-hskrb"] Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.106515 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-hskrb"] Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.112131 4906 scope.go:117] "RemoveContainer" containerID="410b0c40f63f43dcd6fb50a359080c2c076f62858b948841cdbaf8c354d198f7" Mar 10 00:17:47 crc kubenswrapper[4906]: W0310 00:17:47.128158 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd125115c_01cb_4cfc_9729_93f80c75109b.slice/crio-9625c77b1ee9bc7dd076052e7b6ddb42ee8969595d21208b6b6a93506ff6ec13 WatchSource:0}: Error finding container 9625c77b1ee9bc7dd076052e7b6ddb42ee8969595d21208b6b6a93506ff6ec13: Status 404 returned error can't find the container with id 9625c77b1ee9bc7dd076052e7b6ddb42ee8969595d21208b6b6a93506ff6ec13 Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.134364 4906 scope.go:117] "RemoveContainer" containerID="c09a8ef5244efb12dbf1782379b9633e2c35e936c2d1df7d715481c9b3bf0b21" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.163716 4906 scope.go:117] "RemoveContainer" containerID="f0d58d338e79d263fe3e7d3ca54f65b5ff80376c0c094e3f51c66c00ad8ae878" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.203561 4906 scope.go:117] "RemoveContainer" containerID="fee1a9d86245fa50b0fdbc1fd897326de61c74df8a05667d2dfca5a397a105c3" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.231397 4906 scope.go:117] "RemoveContainer" 
containerID="911516785da6f740d1b7e21cd68006ed12e6e826d87405543b150a5b6331deb6" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.259706 4906 scope.go:117] "RemoveContainer" containerID="58958cb3a6a43945e2e85351e464979217571ace2e8aff3602776ecb003df0be" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.293606 4906 scope.go:117] "RemoveContainer" containerID="6f75ed8b74bcf143893e0244f5d561e39792f3c066b4ea296e5e75dd0999a43a" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.325368 4906 scope.go:117] "RemoveContainer" containerID="52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.419627 4906 scope.go:117] "RemoveContainer" containerID="5e998996c1ab592a23043a363deaef69007f550b1ab7d626bc644bc82db78bda" Mar 10 00:17:47 crc kubenswrapper[4906]: E0310 00:17:47.420591 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e998996c1ab592a23043a363deaef69007f550b1ab7d626bc644bc82db78bda\": container with ID starting with 5e998996c1ab592a23043a363deaef69007f550b1ab7d626bc644bc82db78bda not found: ID does not exist" containerID="5e998996c1ab592a23043a363deaef69007f550b1ab7d626bc644bc82db78bda" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.420701 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e998996c1ab592a23043a363deaef69007f550b1ab7d626bc644bc82db78bda"} err="failed to get container status \"5e998996c1ab592a23043a363deaef69007f550b1ab7d626bc644bc82db78bda\": rpc error: code = NotFound desc = could not find container \"5e998996c1ab592a23043a363deaef69007f550b1ab7d626bc644bc82db78bda\": container with ID starting with 5e998996c1ab592a23043a363deaef69007f550b1ab7d626bc644bc82db78bda not found: ID does not exist" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.420751 4906 scope.go:117] "RemoveContainer" 
containerID="6fe06556da9fc87aaebca8e8b6c030f6568bc352414267339059a3f9a1faff25" Mar 10 00:17:47 crc kubenswrapper[4906]: E0310 00:17:47.421559 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fe06556da9fc87aaebca8e8b6c030f6568bc352414267339059a3f9a1faff25\": container with ID starting with 6fe06556da9fc87aaebca8e8b6c030f6568bc352414267339059a3f9a1faff25 not found: ID does not exist" containerID="6fe06556da9fc87aaebca8e8b6c030f6568bc352414267339059a3f9a1faff25" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.421738 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fe06556da9fc87aaebca8e8b6c030f6568bc352414267339059a3f9a1faff25"} err="failed to get container status \"6fe06556da9fc87aaebca8e8b6c030f6568bc352414267339059a3f9a1faff25\": rpc error: code = NotFound desc = could not find container \"6fe06556da9fc87aaebca8e8b6c030f6568bc352414267339059a3f9a1faff25\": container with ID starting with 6fe06556da9fc87aaebca8e8b6c030f6568bc352414267339059a3f9a1faff25 not found: ID does not exist" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.421794 4906 scope.go:117] "RemoveContainer" containerID="410b0c40f63f43dcd6fb50a359080c2c076f62858b948841cdbaf8c354d198f7" Mar 10 00:17:47 crc kubenswrapper[4906]: E0310 00:17:47.422481 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"410b0c40f63f43dcd6fb50a359080c2c076f62858b948841cdbaf8c354d198f7\": container with ID starting with 410b0c40f63f43dcd6fb50a359080c2c076f62858b948841cdbaf8c354d198f7 not found: ID does not exist" containerID="410b0c40f63f43dcd6fb50a359080c2c076f62858b948841cdbaf8c354d198f7" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.422528 4906 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"410b0c40f63f43dcd6fb50a359080c2c076f62858b948841cdbaf8c354d198f7"} err="failed to get container status \"410b0c40f63f43dcd6fb50a359080c2c076f62858b948841cdbaf8c354d198f7\": rpc error: code = NotFound desc = could not find container \"410b0c40f63f43dcd6fb50a359080c2c076f62858b948841cdbaf8c354d198f7\": container with ID starting with 410b0c40f63f43dcd6fb50a359080c2c076f62858b948841cdbaf8c354d198f7 not found: ID does not exist" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.422556 4906 scope.go:117] "RemoveContainer" containerID="c09a8ef5244efb12dbf1782379b9633e2c35e936c2d1df7d715481c9b3bf0b21" Mar 10 00:17:47 crc kubenswrapper[4906]: E0310 00:17:47.424194 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c09a8ef5244efb12dbf1782379b9633e2c35e936c2d1df7d715481c9b3bf0b21\": container with ID starting with c09a8ef5244efb12dbf1782379b9633e2c35e936c2d1df7d715481c9b3bf0b21 not found: ID does not exist" containerID="c09a8ef5244efb12dbf1782379b9633e2c35e936c2d1df7d715481c9b3bf0b21" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.424227 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c09a8ef5244efb12dbf1782379b9633e2c35e936c2d1df7d715481c9b3bf0b21"} err="failed to get container status \"c09a8ef5244efb12dbf1782379b9633e2c35e936c2d1df7d715481c9b3bf0b21\": rpc error: code = NotFound desc = could not find container \"c09a8ef5244efb12dbf1782379b9633e2c35e936c2d1df7d715481c9b3bf0b21\": container with ID starting with c09a8ef5244efb12dbf1782379b9633e2c35e936c2d1df7d715481c9b3bf0b21 not found: ID does not exist" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.424258 4906 scope.go:117] "RemoveContainer" containerID="f0d58d338e79d263fe3e7d3ca54f65b5ff80376c0c094e3f51c66c00ad8ae878" Mar 10 00:17:47 crc kubenswrapper[4906]: E0310 00:17:47.424939 4906 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"f0d58d338e79d263fe3e7d3ca54f65b5ff80376c0c094e3f51c66c00ad8ae878\": container with ID starting with f0d58d338e79d263fe3e7d3ca54f65b5ff80376c0c094e3f51c66c00ad8ae878 not found: ID does not exist" containerID="f0d58d338e79d263fe3e7d3ca54f65b5ff80376c0c094e3f51c66c00ad8ae878" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.424992 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0d58d338e79d263fe3e7d3ca54f65b5ff80376c0c094e3f51c66c00ad8ae878"} err="failed to get container status \"f0d58d338e79d263fe3e7d3ca54f65b5ff80376c0c094e3f51c66c00ad8ae878\": rpc error: code = NotFound desc = could not find container \"f0d58d338e79d263fe3e7d3ca54f65b5ff80376c0c094e3f51c66c00ad8ae878\": container with ID starting with f0d58d338e79d263fe3e7d3ca54f65b5ff80376c0c094e3f51c66c00ad8ae878 not found: ID does not exist" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.425024 4906 scope.go:117] "RemoveContainer" containerID="fee1a9d86245fa50b0fdbc1fd897326de61c74df8a05667d2dfca5a397a105c3" Mar 10 00:17:47 crc kubenswrapper[4906]: E0310 00:17:47.425736 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fee1a9d86245fa50b0fdbc1fd897326de61c74df8a05667d2dfca5a397a105c3\": container with ID starting with fee1a9d86245fa50b0fdbc1fd897326de61c74df8a05667d2dfca5a397a105c3 not found: ID does not exist" containerID="fee1a9d86245fa50b0fdbc1fd897326de61c74df8a05667d2dfca5a397a105c3" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.425814 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fee1a9d86245fa50b0fdbc1fd897326de61c74df8a05667d2dfca5a397a105c3"} err="failed to get container status \"fee1a9d86245fa50b0fdbc1fd897326de61c74df8a05667d2dfca5a397a105c3\": rpc error: code = NotFound desc = could not find container 
\"fee1a9d86245fa50b0fdbc1fd897326de61c74df8a05667d2dfca5a397a105c3\": container with ID starting with fee1a9d86245fa50b0fdbc1fd897326de61c74df8a05667d2dfca5a397a105c3 not found: ID does not exist" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.425857 4906 scope.go:117] "RemoveContainer" containerID="911516785da6f740d1b7e21cd68006ed12e6e826d87405543b150a5b6331deb6" Mar 10 00:17:47 crc kubenswrapper[4906]: E0310 00:17:47.426704 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"911516785da6f740d1b7e21cd68006ed12e6e826d87405543b150a5b6331deb6\": container with ID starting with 911516785da6f740d1b7e21cd68006ed12e6e826d87405543b150a5b6331deb6 not found: ID does not exist" containerID="911516785da6f740d1b7e21cd68006ed12e6e826d87405543b150a5b6331deb6" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.426810 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"911516785da6f740d1b7e21cd68006ed12e6e826d87405543b150a5b6331deb6"} err="failed to get container status \"911516785da6f740d1b7e21cd68006ed12e6e826d87405543b150a5b6331deb6\": rpc error: code = NotFound desc = could not find container \"911516785da6f740d1b7e21cd68006ed12e6e826d87405543b150a5b6331deb6\": container with ID starting with 911516785da6f740d1b7e21cd68006ed12e6e826d87405543b150a5b6331deb6 not found: ID does not exist" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.426842 4906 scope.go:117] "RemoveContainer" containerID="58958cb3a6a43945e2e85351e464979217571ace2e8aff3602776ecb003df0be" Mar 10 00:17:47 crc kubenswrapper[4906]: E0310 00:17:47.427534 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58958cb3a6a43945e2e85351e464979217571ace2e8aff3602776ecb003df0be\": container with ID starting with 58958cb3a6a43945e2e85351e464979217571ace2e8aff3602776ecb003df0be not found: ID does not exist" 
containerID="58958cb3a6a43945e2e85351e464979217571ace2e8aff3602776ecb003df0be" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.427582 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58958cb3a6a43945e2e85351e464979217571ace2e8aff3602776ecb003df0be"} err="failed to get container status \"58958cb3a6a43945e2e85351e464979217571ace2e8aff3602776ecb003df0be\": rpc error: code = NotFound desc = could not find container \"58958cb3a6a43945e2e85351e464979217571ace2e8aff3602776ecb003df0be\": container with ID starting with 58958cb3a6a43945e2e85351e464979217571ace2e8aff3602776ecb003df0be not found: ID does not exist" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.427611 4906 scope.go:117] "RemoveContainer" containerID="6f75ed8b74bcf143893e0244f5d561e39792f3c066b4ea296e5e75dd0999a43a" Mar 10 00:17:47 crc kubenswrapper[4906]: E0310 00:17:47.428172 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f75ed8b74bcf143893e0244f5d561e39792f3c066b4ea296e5e75dd0999a43a\": container with ID starting with 6f75ed8b74bcf143893e0244f5d561e39792f3c066b4ea296e5e75dd0999a43a not found: ID does not exist" containerID="6f75ed8b74bcf143893e0244f5d561e39792f3c066b4ea296e5e75dd0999a43a" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.428229 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f75ed8b74bcf143893e0244f5d561e39792f3c066b4ea296e5e75dd0999a43a"} err="failed to get container status \"6f75ed8b74bcf143893e0244f5d561e39792f3c066b4ea296e5e75dd0999a43a\": rpc error: code = NotFound desc = could not find container \"6f75ed8b74bcf143893e0244f5d561e39792f3c066b4ea296e5e75dd0999a43a\": container with ID starting with 6f75ed8b74bcf143893e0244f5d561e39792f3c066b4ea296e5e75dd0999a43a not found: ID does not exist" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.428264 4906 scope.go:117] 
"RemoveContainer" containerID="52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80" Mar 10 00:17:47 crc kubenswrapper[4906]: E0310 00:17:47.428973 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\": container with ID starting with 52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80 not found: ID does not exist" containerID="52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.429131 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80"} err="failed to get container status \"52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\": rpc error: code = NotFound desc = could not find container \"52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\": container with ID starting with 52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80 not found: ID does not exist" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.429205 4906 scope.go:117] "RemoveContainer" containerID="5e998996c1ab592a23043a363deaef69007f550b1ab7d626bc644bc82db78bda" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.430082 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e998996c1ab592a23043a363deaef69007f550b1ab7d626bc644bc82db78bda"} err="failed to get container status \"5e998996c1ab592a23043a363deaef69007f550b1ab7d626bc644bc82db78bda\": rpc error: code = NotFound desc = could not find container \"5e998996c1ab592a23043a363deaef69007f550b1ab7d626bc644bc82db78bda\": container with ID starting with 5e998996c1ab592a23043a363deaef69007f550b1ab7d626bc644bc82db78bda not found: ID does not exist" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.430132 4906 
scope.go:117] "RemoveContainer" containerID="6fe06556da9fc87aaebca8e8b6c030f6568bc352414267339059a3f9a1faff25" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.430860 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fe06556da9fc87aaebca8e8b6c030f6568bc352414267339059a3f9a1faff25"} err="failed to get container status \"6fe06556da9fc87aaebca8e8b6c030f6568bc352414267339059a3f9a1faff25\": rpc error: code = NotFound desc = could not find container \"6fe06556da9fc87aaebca8e8b6c030f6568bc352414267339059a3f9a1faff25\": container with ID starting with 6fe06556da9fc87aaebca8e8b6c030f6568bc352414267339059a3f9a1faff25 not found: ID does not exist" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.430900 4906 scope.go:117] "RemoveContainer" containerID="410b0c40f63f43dcd6fb50a359080c2c076f62858b948841cdbaf8c354d198f7" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.431733 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"410b0c40f63f43dcd6fb50a359080c2c076f62858b948841cdbaf8c354d198f7"} err="failed to get container status \"410b0c40f63f43dcd6fb50a359080c2c076f62858b948841cdbaf8c354d198f7\": rpc error: code = NotFound desc = could not find container \"410b0c40f63f43dcd6fb50a359080c2c076f62858b948841cdbaf8c354d198f7\": container with ID starting with 410b0c40f63f43dcd6fb50a359080c2c076f62858b948841cdbaf8c354d198f7 not found: ID does not exist" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.431919 4906 scope.go:117] "RemoveContainer" containerID="c09a8ef5244efb12dbf1782379b9633e2c35e936c2d1df7d715481c9b3bf0b21" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.433785 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c09a8ef5244efb12dbf1782379b9633e2c35e936c2d1df7d715481c9b3bf0b21"} err="failed to get container status \"c09a8ef5244efb12dbf1782379b9633e2c35e936c2d1df7d715481c9b3bf0b21\": rpc 
error: code = NotFound desc = could not find container \"c09a8ef5244efb12dbf1782379b9633e2c35e936c2d1df7d715481c9b3bf0b21\": container with ID starting with c09a8ef5244efb12dbf1782379b9633e2c35e936c2d1df7d715481c9b3bf0b21 not found: ID does not exist" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.433832 4906 scope.go:117] "RemoveContainer" containerID="f0d58d338e79d263fe3e7d3ca54f65b5ff80376c0c094e3f51c66c00ad8ae878" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.435560 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0d58d338e79d263fe3e7d3ca54f65b5ff80376c0c094e3f51c66c00ad8ae878"} err="failed to get container status \"f0d58d338e79d263fe3e7d3ca54f65b5ff80376c0c094e3f51c66c00ad8ae878\": rpc error: code = NotFound desc = could not find container \"f0d58d338e79d263fe3e7d3ca54f65b5ff80376c0c094e3f51c66c00ad8ae878\": container with ID starting with f0d58d338e79d263fe3e7d3ca54f65b5ff80376c0c094e3f51c66c00ad8ae878 not found: ID does not exist" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.435627 4906 scope.go:117] "RemoveContainer" containerID="fee1a9d86245fa50b0fdbc1fd897326de61c74df8a05667d2dfca5a397a105c3" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.436189 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fee1a9d86245fa50b0fdbc1fd897326de61c74df8a05667d2dfca5a397a105c3"} err="failed to get container status \"fee1a9d86245fa50b0fdbc1fd897326de61c74df8a05667d2dfca5a397a105c3\": rpc error: code = NotFound desc = could not find container \"fee1a9d86245fa50b0fdbc1fd897326de61c74df8a05667d2dfca5a397a105c3\": container with ID starting with fee1a9d86245fa50b0fdbc1fd897326de61c74df8a05667d2dfca5a397a105c3 not found: ID does not exist" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.436247 4906 scope.go:117] "RemoveContainer" containerID="911516785da6f740d1b7e21cd68006ed12e6e826d87405543b150a5b6331deb6" Mar 10 00:17:47 crc 
kubenswrapper[4906]: I0310 00:17:47.436915 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"911516785da6f740d1b7e21cd68006ed12e6e826d87405543b150a5b6331deb6"} err="failed to get container status \"911516785da6f740d1b7e21cd68006ed12e6e826d87405543b150a5b6331deb6\": rpc error: code = NotFound desc = could not find container \"911516785da6f740d1b7e21cd68006ed12e6e826d87405543b150a5b6331deb6\": container with ID starting with 911516785da6f740d1b7e21cd68006ed12e6e826d87405543b150a5b6331deb6 not found: ID does not exist" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.436973 4906 scope.go:117] "RemoveContainer" containerID="58958cb3a6a43945e2e85351e464979217571ace2e8aff3602776ecb003df0be" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.438124 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58958cb3a6a43945e2e85351e464979217571ace2e8aff3602776ecb003df0be"} err="failed to get container status \"58958cb3a6a43945e2e85351e464979217571ace2e8aff3602776ecb003df0be\": rpc error: code = NotFound desc = could not find container \"58958cb3a6a43945e2e85351e464979217571ace2e8aff3602776ecb003df0be\": container with ID starting with 58958cb3a6a43945e2e85351e464979217571ace2e8aff3602776ecb003df0be not found: ID does not exist" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.438169 4906 scope.go:117] "RemoveContainer" containerID="6f75ed8b74bcf143893e0244f5d561e39792f3c066b4ea296e5e75dd0999a43a" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.438811 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f75ed8b74bcf143893e0244f5d561e39792f3c066b4ea296e5e75dd0999a43a"} err="failed to get container status \"6f75ed8b74bcf143893e0244f5d561e39792f3c066b4ea296e5e75dd0999a43a\": rpc error: code = NotFound desc = could not find container \"6f75ed8b74bcf143893e0244f5d561e39792f3c066b4ea296e5e75dd0999a43a\": container 
with ID starting with 6f75ed8b74bcf143893e0244f5d561e39792f3c066b4ea296e5e75dd0999a43a not found: ID does not exist" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.438858 4906 scope.go:117] "RemoveContainer" containerID="52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.439413 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80"} err="failed to get container status \"52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\": rpc error: code = NotFound desc = could not find container \"52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\": container with ID starting with 52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80 not found: ID does not exist" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.439469 4906 scope.go:117] "RemoveContainer" containerID="5e998996c1ab592a23043a363deaef69007f550b1ab7d626bc644bc82db78bda" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.440341 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e998996c1ab592a23043a363deaef69007f550b1ab7d626bc644bc82db78bda"} err="failed to get container status \"5e998996c1ab592a23043a363deaef69007f550b1ab7d626bc644bc82db78bda\": rpc error: code = NotFound desc = could not find container \"5e998996c1ab592a23043a363deaef69007f550b1ab7d626bc644bc82db78bda\": container with ID starting with 5e998996c1ab592a23043a363deaef69007f550b1ab7d626bc644bc82db78bda not found: ID does not exist" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.440397 4906 scope.go:117] "RemoveContainer" containerID="6fe06556da9fc87aaebca8e8b6c030f6568bc352414267339059a3f9a1faff25" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.440991 4906 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6fe06556da9fc87aaebca8e8b6c030f6568bc352414267339059a3f9a1faff25"} err="failed to get container status \"6fe06556da9fc87aaebca8e8b6c030f6568bc352414267339059a3f9a1faff25\": rpc error: code = NotFound desc = could not find container \"6fe06556da9fc87aaebca8e8b6c030f6568bc352414267339059a3f9a1faff25\": container with ID starting with 6fe06556da9fc87aaebca8e8b6c030f6568bc352414267339059a3f9a1faff25 not found: ID does not exist" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.441064 4906 scope.go:117] "RemoveContainer" containerID="410b0c40f63f43dcd6fb50a359080c2c076f62858b948841cdbaf8c354d198f7" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.441797 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"410b0c40f63f43dcd6fb50a359080c2c076f62858b948841cdbaf8c354d198f7"} err="failed to get container status \"410b0c40f63f43dcd6fb50a359080c2c076f62858b948841cdbaf8c354d198f7\": rpc error: code = NotFound desc = could not find container \"410b0c40f63f43dcd6fb50a359080c2c076f62858b948841cdbaf8c354d198f7\": container with ID starting with 410b0c40f63f43dcd6fb50a359080c2c076f62858b948841cdbaf8c354d198f7 not found: ID does not exist" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.441841 4906 scope.go:117] "RemoveContainer" containerID="c09a8ef5244efb12dbf1782379b9633e2c35e936c2d1df7d715481c9b3bf0b21" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.442459 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c09a8ef5244efb12dbf1782379b9633e2c35e936c2d1df7d715481c9b3bf0b21"} err="failed to get container status \"c09a8ef5244efb12dbf1782379b9633e2c35e936c2d1df7d715481c9b3bf0b21\": rpc error: code = NotFound desc = could not find container \"c09a8ef5244efb12dbf1782379b9633e2c35e936c2d1df7d715481c9b3bf0b21\": container with ID starting with c09a8ef5244efb12dbf1782379b9633e2c35e936c2d1df7d715481c9b3bf0b21 not found: ID does not 
exist" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.442568 4906 scope.go:117] "RemoveContainer" containerID="f0d58d338e79d263fe3e7d3ca54f65b5ff80376c0c094e3f51c66c00ad8ae878" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.443713 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0d58d338e79d263fe3e7d3ca54f65b5ff80376c0c094e3f51c66c00ad8ae878"} err="failed to get container status \"f0d58d338e79d263fe3e7d3ca54f65b5ff80376c0c094e3f51c66c00ad8ae878\": rpc error: code = NotFound desc = could not find container \"f0d58d338e79d263fe3e7d3ca54f65b5ff80376c0c094e3f51c66c00ad8ae878\": container with ID starting with f0d58d338e79d263fe3e7d3ca54f65b5ff80376c0c094e3f51c66c00ad8ae878 not found: ID does not exist" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.443769 4906 scope.go:117] "RemoveContainer" containerID="fee1a9d86245fa50b0fdbc1fd897326de61c74df8a05667d2dfca5a397a105c3" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.444101 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fee1a9d86245fa50b0fdbc1fd897326de61c74df8a05667d2dfca5a397a105c3"} err="failed to get container status \"fee1a9d86245fa50b0fdbc1fd897326de61c74df8a05667d2dfca5a397a105c3\": rpc error: code = NotFound desc = could not find container \"fee1a9d86245fa50b0fdbc1fd897326de61c74df8a05667d2dfca5a397a105c3\": container with ID starting with fee1a9d86245fa50b0fdbc1fd897326de61c74df8a05667d2dfca5a397a105c3 not found: ID does not exist" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.444136 4906 scope.go:117] "RemoveContainer" containerID="911516785da6f740d1b7e21cd68006ed12e6e826d87405543b150a5b6331deb6" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.444524 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"911516785da6f740d1b7e21cd68006ed12e6e826d87405543b150a5b6331deb6"} err="failed to get container status 
\"911516785da6f740d1b7e21cd68006ed12e6e826d87405543b150a5b6331deb6\": rpc error: code = NotFound desc = could not find container \"911516785da6f740d1b7e21cd68006ed12e6e826d87405543b150a5b6331deb6\": container with ID starting with 911516785da6f740d1b7e21cd68006ed12e6e826d87405543b150a5b6331deb6 not found: ID does not exist" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.444551 4906 scope.go:117] "RemoveContainer" containerID="58958cb3a6a43945e2e85351e464979217571ace2e8aff3602776ecb003df0be" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.446028 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58958cb3a6a43945e2e85351e464979217571ace2e8aff3602776ecb003df0be"} err="failed to get container status \"58958cb3a6a43945e2e85351e464979217571ace2e8aff3602776ecb003df0be\": rpc error: code = NotFound desc = could not find container \"58958cb3a6a43945e2e85351e464979217571ace2e8aff3602776ecb003df0be\": container with ID starting with 58958cb3a6a43945e2e85351e464979217571ace2e8aff3602776ecb003df0be not found: ID does not exist" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.446056 4906 scope.go:117] "RemoveContainer" containerID="6f75ed8b74bcf143893e0244f5d561e39792f3c066b4ea296e5e75dd0999a43a" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.446493 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f75ed8b74bcf143893e0244f5d561e39792f3c066b4ea296e5e75dd0999a43a"} err="failed to get container status \"6f75ed8b74bcf143893e0244f5d561e39792f3c066b4ea296e5e75dd0999a43a\": rpc error: code = NotFound desc = could not find container \"6f75ed8b74bcf143893e0244f5d561e39792f3c066b4ea296e5e75dd0999a43a\": container with ID starting with 6f75ed8b74bcf143893e0244f5d561e39792f3c066b4ea296e5e75dd0999a43a not found: ID does not exist" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.446533 4906 scope.go:117] "RemoveContainer" 
containerID="52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.446978 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80"} err="failed to get container status \"52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\": rpc error: code = NotFound desc = could not find container \"52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\": container with ID starting with 52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80 not found: ID does not exist" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.447011 4906 scope.go:117] "RemoveContainer" containerID="5e998996c1ab592a23043a363deaef69007f550b1ab7d626bc644bc82db78bda" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.447369 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e998996c1ab592a23043a363deaef69007f550b1ab7d626bc644bc82db78bda"} err="failed to get container status \"5e998996c1ab592a23043a363deaef69007f550b1ab7d626bc644bc82db78bda\": rpc error: code = NotFound desc = could not find container \"5e998996c1ab592a23043a363deaef69007f550b1ab7d626bc644bc82db78bda\": container with ID starting with 5e998996c1ab592a23043a363deaef69007f550b1ab7d626bc644bc82db78bda not found: ID does not exist" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.447403 4906 scope.go:117] "RemoveContainer" containerID="6fe06556da9fc87aaebca8e8b6c030f6568bc352414267339059a3f9a1faff25" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.447867 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fe06556da9fc87aaebca8e8b6c030f6568bc352414267339059a3f9a1faff25"} err="failed to get container status \"6fe06556da9fc87aaebca8e8b6c030f6568bc352414267339059a3f9a1faff25\": rpc error: code = NotFound desc = could 
not find container \"6fe06556da9fc87aaebca8e8b6c030f6568bc352414267339059a3f9a1faff25\": container with ID starting with 6fe06556da9fc87aaebca8e8b6c030f6568bc352414267339059a3f9a1faff25 not found: ID does not exist" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.447901 4906 scope.go:117] "RemoveContainer" containerID="410b0c40f63f43dcd6fb50a359080c2c076f62858b948841cdbaf8c354d198f7" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.448186 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"410b0c40f63f43dcd6fb50a359080c2c076f62858b948841cdbaf8c354d198f7"} err="failed to get container status \"410b0c40f63f43dcd6fb50a359080c2c076f62858b948841cdbaf8c354d198f7\": rpc error: code = NotFound desc = could not find container \"410b0c40f63f43dcd6fb50a359080c2c076f62858b948841cdbaf8c354d198f7\": container with ID starting with 410b0c40f63f43dcd6fb50a359080c2c076f62858b948841cdbaf8c354d198f7 not found: ID does not exist" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.448222 4906 scope.go:117] "RemoveContainer" containerID="c09a8ef5244efb12dbf1782379b9633e2c35e936c2d1df7d715481c9b3bf0b21" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.448730 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c09a8ef5244efb12dbf1782379b9633e2c35e936c2d1df7d715481c9b3bf0b21"} err="failed to get container status \"c09a8ef5244efb12dbf1782379b9633e2c35e936c2d1df7d715481c9b3bf0b21\": rpc error: code = NotFound desc = could not find container \"c09a8ef5244efb12dbf1782379b9633e2c35e936c2d1df7d715481c9b3bf0b21\": container with ID starting with c09a8ef5244efb12dbf1782379b9633e2c35e936c2d1df7d715481c9b3bf0b21 not found: ID does not exist" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.448769 4906 scope.go:117] "RemoveContainer" containerID="f0d58d338e79d263fe3e7d3ca54f65b5ff80376c0c094e3f51c66c00ad8ae878" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 
00:17:47.449074 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0d58d338e79d263fe3e7d3ca54f65b5ff80376c0c094e3f51c66c00ad8ae878"} err="failed to get container status \"f0d58d338e79d263fe3e7d3ca54f65b5ff80376c0c094e3f51c66c00ad8ae878\": rpc error: code = NotFound desc = could not find container \"f0d58d338e79d263fe3e7d3ca54f65b5ff80376c0c094e3f51c66c00ad8ae878\": container with ID starting with f0d58d338e79d263fe3e7d3ca54f65b5ff80376c0c094e3f51c66c00ad8ae878 not found: ID does not exist" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.449106 4906 scope.go:117] "RemoveContainer" containerID="fee1a9d86245fa50b0fdbc1fd897326de61c74df8a05667d2dfca5a397a105c3" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.449513 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fee1a9d86245fa50b0fdbc1fd897326de61c74df8a05667d2dfca5a397a105c3"} err="failed to get container status \"fee1a9d86245fa50b0fdbc1fd897326de61c74df8a05667d2dfca5a397a105c3\": rpc error: code = NotFound desc = could not find container \"fee1a9d86245fa50b0fdbc1fd897326de61c74df8a05667d2dfca5a397a105c3\": container with ID starting with fee1a9d86245fa50b0fdbc1fd897326de61c74df8a05667d2dfca5a397a105c3 not found: ID does not exist" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.449553 4906 scope.go:117] "RemoveContainer" containerID="911516785da6f740d1b7e21cd68006ed12e6e826d87405543b150a5b6331deb6" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.450059 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"911516785da6f740d1b7e21cd68006ed12e6e826d87405543b150a5b6331deb6"} err="failed to get container status \"911516785da6f740d1b7e21cd68006ed12e6e826d87405543b150a5b6331deb6\": rpc error: code = NotFound desc = could not find container \"911516785da6f740d1b7e21cd68006ed12e6e826d87405543b150a5b6331deb6\": container with ID starting with 
911516785da6f740d1b7e21cd68006ed12e6e826d87405543b150a5b6331deb6 not found: ID does not exist" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.450094 4906 scope.go:117] "RemoveContainer" containerID="58958cb3a6a43945e2e85351e464979217571ace2e8aff3602776ecb003df0be" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.450418 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58958cb3a6a43945e2e85351e464979217571ace2e8aff3602776ecb003df0be"} err="failed to get container status \"58958cb3a6a43945e2e85351e464979217571ace2e8aff3602776ecb003df0be\": rpc error: code = NotFound desc = could not find container \"58958cb3a6a43945e2e85351e464979217571ace2e8aff3602776ecb003df0be\": container with ID starting with 58958cb3a6a43945e2e85351e464979217571ace2e8aff3602776ecb003df0be not found: ID does not exist" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.450453 4906 scope.go:117] "RemoveContainer" containerID="6f75ed8b74bcf143893e0244f5d561e39792f3c066b4ea296e5e75dd0999a43a" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.450807 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f75ed8b74bcf143893e0244f5d561e39792f3c066b4ea296e5e75dd0999a43a"} err="failed to get container status \"6f75ed8b74bcf143893e0244f5d561e39792f3c066b4ea296e5e75dd0999a43a\": rpc error: code = NotFound desc = could not find container \"6f75ed8b74bcf143893e0244f5d561e39792f3c066b4ea296e5e75dd0999a43a\": container with ID starting with 6f75ed8b74bcf143893e0244f5d561e39792f3c066b4ea296e5e75dd0999a43a not found: ID does not exist" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.450840 4906 scope.go:117] "RemoveContainer" containerID="52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.451190 4906 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80"} err="failed to get container status \"52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\": rpc error: code = NotFound desc = could not find container \"52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80\": container with ID starting with 52b72675ce4cf6e8d65d9efec6f1948eca320b044ef1dbb4ea3a4de71ccd5a80 not found: ID does not exist" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.451225 4906 scope.go:117] "RemoveContainer" containerID="5e998996c1ab592a23043a363deaef69007f550b1ab7d626bc644bc82db78bda" Mar 10 00:17:47 crc kubenswrapper[4906]: I0310 00:17:47.451594 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e998996c1ab592a23043a363deaef69007f550b1ab7d626bc644bc82db78bda"} err="failed to get container status \"5e998996c1ab592a23043a363deaef69007f550b1ab7d626bc644bc82db78bda\": rpc error: code = NotFound desc = could not find container \"5e998996c1ab592a23043a363deaef69007f550b1ab7d626bc644bc82db78bda\": container with ID starting with 5e998996c1ab592a23043a363deaef69007f550b1ab7d626bc644bc82db78bda not found: ID does not exist" Mar 10 00:17:48 crc kubenswrapper[4906]: I0310 00:17:48.052048 4906 generic.go:334] "Generic (PLEG): container finished" podID="d125115c-01cb-4cfc-9729-93f80c75109b" containerID="324457d05fae3be739d08bc21afe5ac92d3c4226aad29e4054357ca93cd52055" exitCode=0 Mar 10 00:17:48 crc kubenswrapper[4906]: I0310 00:17:48.052201 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" event={"ID":"d125115c-01cb-4cfc-9729-93f80c75109b","Type":"ContainerDied","Data":"324457d05fae3be739d08bc21afe5ac92d3c4226aad29e4054357ca93cd52055"} Mar 10 00:17:48 crc kubenswrapper[4906]: I0310 00:17:48.052615 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" 
event={"ID":"d125115c-01cb-4cfc-9729-93f80c75109b","Type":"ContainerStarted","Data":"9625c77b1ee9bc7dd076052e7b6ddb42ee8969595d21208b6b6a93506ff6ec13"} Mar 10 00:17:48 crc kubenswrapper[4906]: I0310 00:17:48.592884 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9f87520-6105-4b6f-ba5a-a232b5dc24c0" path="/var/lib/kubelet/pods/c9f87520-6105-4b6f-ba5a-a232b5dc24c0/volumes" Mar 10 00:17:49 crc kubenswrapper[4906]: I0310 00:17:49.067531 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" event={"ID":"d125115c-01cb-4cfc-9729-93f80c75109b","Type":"ContainerStarted","Data":"d552e5df8b4300efebd41e6d5c767c3da0aebcc8bff106561feaee53f6e00198"} Mar 10 00:17:49 crc kubenswrapper[4906]: I0310 00:17:49.068157 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" event={"ID":"d125115c-01cb-4cfc-9729-93f80c75109b","Type":"ContainerStarted","Data":"c70a1ee981306485ea2799da369381dac86fc6a5b07da2ea4ff3602da9487e12"} Mar 10 00:17:49 crc kubenswrapper[4906]: I0310 00:17:49.068188 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" event={"ID":"d125115c-01cb-4cfc-9729-93f80c75109b","Type":"ContainerStarted","Data":"82ab30a4381dc0459e3483e91137f1b5f732d1749d91dfeb75ba0613a9aa7dbc"} Mar 10 00:17:49 crc kubenswrapper[4906]: I0310 00:17:49.068208 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" event={"ID":"d125115c-01cb-4cfc-9729-93f80c75109b","Type":"ContainerStarted","Data":"62e8ff2a58a2197051c51351cdce99934f4ce15933d2a13bd4fda166fe38557f"} Mar 10 00:17:49 crc kubenswrapper[4906]: I0310 00:17:49.068272 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" 
event={"ID":"d125115c-01cb-4cfc-9729-93f80c75109b","Type":"ContainerStarted","Data":"fac5a3d812798ce9e88bd1bb00a3448fe3ce8d910f65a8ad97fd3e5a8b242d85"} Mar 10 00:17:50 crc kubenswrapper[4906]: I0310 00:17:50.081616 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" event={"ID":"d125115c-01cb-4cfc-9729-93f80c75109b","Type":"ContainerStarted","Data":"675b4c8ca5b306519d5a354cd38bbee4bd83ccda953f0662d132136c6b59bfd7"} Mar 10 00:17:52 crc kubenswrapper[4906]: I0310 00:17:52.106489 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" event={"ID":"d125115c-01cb-4cfc-9729-93f80c75109b","Type":"ContainerStarted","Data":"48b969768abfe71c336f88aabba9d1c631e02207f3e91c8f3068958f8f8463de"} Mar 10 00:17:54 crc kubenswrapper[4906]: I0310 00:17:54.135494 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" event={"ID":"d125115c-01cb-4cfc-9729-93f80c75109b","Type":"ContainerStarted","Data":"ff674a2fe5850777f7395b182ba5418eba51afe428a7aa780210ebd0bb2ca1a6"} Mar 10 00:17:54 crc kubenswrapper[4906]: I0310 00:17:54.136405 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:54 crc kubenswrapper[4906]: I0310 00:17:54.175019 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:54 crc kubenswrapper[4906]: I0310 00:17:54.186551 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" podStartSLOduration=8.186530499 podStartE2EDuration="8.186530499s" podCreationTimestamp="2026-03-10 00:17:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:17:54.18267018 +0000 UTC m=+700.330565292" 
watchObservedRunningTime="2026-03-10 00:17:54.186530499 +0000 UTC m=+700.334425611" Mar 10 00:17:55 crc kubenswrapper[4906]: I0310 00:17:55.144339 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:55 crc kubenswrapper[4906]: I0310 00:17:55.144420 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:55 crc kubenswrapper[4906]: I0310 00:17:55.178676 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:17:58 crc kubenswrapper[4906]: I0310 00:17:58.578063 4906 scope.go:117] "RemoveContainer" containerID="97deba93385ce59ea5f63333b0faca22d02265e44d69bbfbb0df409c4f16bef1" Mar 10 00:17:59 crc kubenswrapper[4906]: I0310 00:17:59.182106 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-85dv2_0c494c18-0d46-4e23-8ef5-214938a66a7b/kube-multus/1.log" Mar 10 00:17:59 crc kubenswrapper[4906]: I0310 00:17:59.183193 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-85dv2_0c494c18-0d46-4e23-8ef5-214938a66a7b/kube-multus/0.log" Mar 10 00:17:59 crc kubenswrapper[4906]: I0310 00:17:59.183264 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-85dv2" event={"ID":"0c494c18-0d46-4e23-8ef5-214938a66a7b","Type":"ContainerStarted","Data":"c21138cd9bd997b58e49a11723321428ef41d4e1edd691dddec99a35c1efbf1e"} Mar 10 00:18:00 crc kubenswrapper[4906]: I0310 00:18:00.143367 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551698-z86s5"] Mar 10 00:18:00 crc kubenswrapper[4906]: I0310 00:18:00.145017 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551698-z86s5" Mar 10 00:18:00 crc kubenswrapper[4906]: I0310 00:18:00.148010 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 00:18:00 crc kubenswrapper[4906]: I0310 00:18:00.150195 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ktdr6" Mar 10 00:18:00 crc kubenswrapper[4906]: I0310 00:18:00.153116 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 00:18:00 crc kubenswrapper[4906]: I0310 00:18:00.159962 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551698-z86s5"] Mar 10 00:18:00 crc kubenswrapper[4906]: I0310 00:18:00.178856 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tvbj\" (UniqueName: \"kubernetes.io/projected/b3f8803e-330e-4afb-af29-1c613251cd1c-kube-api-access-7tvbj\") pod \"auto-csr-approver-29551698-z86s5\" (UID: \"b3f8803e-330e-4afb-af29-1c613251cd1c\") " pod="openshift-infra/auto-csr-approver-29551698-z86s5" Mar 10 00:18:00 crc kubenswrapper[4906]: I0310 00:18:00.280401 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tvbj\" (UniqueName: \"kubernetes.io/projected/b3f8803e-330e-4afb-af29-1c613251cd1c-kube-api-access-7tvbj\") pod \"auto-csr-approver-29551698-z86s5\" (UID: \"b3f8803e-330e-4afb-af29-1c613251cd1c\") " pod="openshift-infra/auto-csr-approver-29551698-z86s5" Mar 10 00:18:00 crc kubenswrapper[4906]: I0310 00:18:00.306048 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tvbj\" (UniqueName: \"kubernetes.io/projected/b3f8803e-330e-4afb-af29-1c613251cd1c-kube-api-access-7tvbj\") pod \"auto-csr-approver-29551698-z86s5\" (UID: \"b3f8803e-330e-4afb-af29-1c613251cd1c\") " 
pod="openshift-infra/auto-csr-approver-29551698-z86s5" Mar 10 00:18:00 crc kubenswrapper[4906]: I0310 00:18:00.468192 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551698-z86s5" Mar 10 00:18:00 crc kubenswrapper[4906]: I0310 00:18:00.813362 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551698-z86s5"] Mar 10 00:18:01 crc kubenswrapper[4906]: I0310 00:18:01.200104 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551698-z86s5" event={"ID":"b3f8803e-330e-4afb-af29-1c613251cd1c","Type":"ContainerStarted","Data":"4d4d5fb1b412d47aa2f65724cf8b40c2918a146ff09c98e57bc0c66890421b88"} Mar 10 00:18:03 crc kubenswrapper[4906]: I0310 00:18:03.220371 4906 generic.go:334] "Generic (PLEG): container finished" podID="b3f8803e-330e-4afb-af29-1c613251cd1c" containerID="32ed69827b44bfbe64167bcb25bdcb84d7ef40d25d70f4386d48e7176763354b" exitCode=0 Mar 10 00:18:03 crc kubenswrapper[4906]: I0310 00:18:03.220577 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551698-z86s5" event={"ID":"b3f8803e-330e-4afb-af29-1c613251cd1c","Type":"ContainerDied","Data":"32ed69827b44bfbe64167bcb25bdcb84d7ef40d25d70f4386d48e7176763354b"} Mar 10 00:18:05 crc kubenswrapper[4906]: I0310 00:18:05.099933 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551698-z86s5" Mar 10 00:18:05 crc kubenswrapper[4906]: I0310 00:18:05.156628 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tvbj\" (UniqueName: \"kubernetes.io/projected/b3f8803e-330e-4afb-af29-1c613251cd1c-kube-api-access-7tvbj\") pod \"b3f8803e-330e-4afb-af29-1c613251cd1c\" (UID: \"b3f8803e-330e-4afb-af29-1c613251cd1c\") " Mar 10 00:18:05 crc kubenswrapper[4906]: I0310 00:18:05.166487 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3f8803e-330e-4afb-af29-1c613251cd1c-kube-api-access-7tvbj" (OuterVolumeSpecName: "kube-api-access-7tvbj") pod "b3f8803e-330e-4afb-af29-1c613251cd1c" (UID: "b3f8803e-330e-4afb-af29-1c613251cd1c"). InnerVolumeSpecName "kube-api-access-7tvbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:18:05 crc kubenswrapper[4906]: I0310 00:18:05.240439 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551698-z86s5" event={"ID":"b3f8803e-330e-4afb-af29-1c613251cd1c","Type":"ContainerDied","Data":"4d4d5fb1b412d47aa2f65724cf8b40c2918a146ff09c98e57bc0c66890421b88"} Mar 10 00:18:05 crc kubenswrapper[4906]: I0310 00:18:05.240514 4906 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d4d5fb1b412d47aa2f65724cf8b40c2918a146ff09c98e57bc0c66890421b88" Mar 10 00:18:05 crc kubenswrapper[4906]: I0310 00:18:05.240535 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551698-z86s5" Mar 10 00:18:05 crc kubenswrapper[4906]: I0310 00:18:05.258129 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tvbj\" (UniqueName: \"kubernetes.io/projected/b3f8803e-330e-4afb-af29-1c613251cd1c-kube-api-access-7tvbj\") on node \"crc\" DevicePath \"\"" Mar 10 00:18:06 crc kubenswrapper[4906]: I0310 00:18:06.192793 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551692-zfzmz"] Mar 10 00:18:06 crc kubenswrapper[4906]: I0310 00:18:06.199413 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551692-zfzmz"] Mar 10 00:18:06 crc kubenswrapper[4906]: I0310 00:18:06.589079 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="580d90a8-af39-47f0-81d8-301d64c29a1c" path="/var/lib/kubelet/pods/580d90a8-af39-47f0-81d8-301d64c29a1c/volumes" Mar 10 00:18:15 crc kubenswrapper[4906]: I0310 00:18:15.238539 4906 scope.go:117] "RemoveContainer" containerID="4968c20beb7d23329043595ef7a842149b3cd4cc66f3f5a5cd7df53b844ba1df" Mar 10 00:18:15 crc kubenswrapper[4906]: I0310 00:18:15.289205 4906 scope.go:117] "RemoveContainer" containerID="7c5df9fb921fc76518f8da175e0a9b4e05f248b292a22ee6bcb600e07d94077a" Mar 10 00:18:16 crc kubenswrapper[4906]: I0310 00:18:16.323667 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-85dv2_0c494c18-0d46-4e23-8ef5-214938a66a7b/kube-multus/1.log" Mar 10 00:18:17 crc kubenswrapper[4906]: I0310 00:18:17.118196 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-t88kj" Mar 10 00:18:48 crc kubenswrapper[4906]: I0310 00:18:48.406069 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4cfdg"] Mar 10 00:18:48 crc kubenswrapper[4906]: I0310 00:18:48.407896 4906 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-marketplace/redhat-marketplace-4cfdg" podUID="0161a078-8da8-4080-bd86-1f8adfd0b57c" containerName="registry-server" containerID="cri-o://d7c471fe26d5a03baf3db8c54f5cf5add3436b1e9e3d3ce8798101f4493e513b" gracePeriod=30 Mar 10 00:18:48 crc kubenswrapper[4906]: I0310 00:18:48.580045 4906 generic.go:334] "Generic (PLEG): container finished" podID="0161a078-8da8-4080-bd86-1f8adfd0b57c" containerID="d7c471fe26d5a03baf3db8c54f5cf5add3436b1e9e3d3ce8798101f4493e513b" exitCode=0 Mar 10 00:18:48 crc kubenswrapper[4906]: I0310 00:18:48.581375 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4cfdg" event={"ID":"0161a078-8da8-4080-bd86-1f8adfd0b57c","Type":"ContainerDied","Data":"d7c471fe26d5a03baf3db8c54f5cf5add3436b1e9e3d3ce8798101f4493e513b"} Mar 10 00:18:48 crc kubenswrapper[4906]: I0310 00:18:48.760099 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4cfdg" Mar 10 00:18:48 crc kubenswrapper[4906]: I0310 00:18:48.798434 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0161a078-8da8-4080-bd86-1f8adfd0b57c-catalog-content\") pod \"0161a078-8da8-4080-bd86-1f8adfd0b57c\" (UID: \"0161a078-8da8-4080-bd86-1f8adfd0b57c\") " Mar 10 00:18:48 crc kubenswrapper[4906]: I0310 00:18:48.798510 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbv5p\" (UniqueName: \"kubernetes.io/projected/0161a078-8da8-4080-bd86-1f8adfd0b57c-kube-api-access-gbv5p\") pod \"0161a078-8da8-4080-bd86-1f8adfd0b57c\" (UID: \"0161a078-8da8-4080-bd86-1f8adfd0b57c\") " Mar 10 00:18:48 crc kubenswrapper[4906]: I0310 00:18:48.798609 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/0161a078-8da8-4080-bd86-1f8adfd0b57c-utilities\") pod \"0161a078-8da8-4080-bd86-1f8adfd0b57c\" (UID: \"0161a078-8da8-4080-bd86-1f8adfd0b57c\") " Mar 10 00:18:48 crc kubenswrapper[4906]: I0310 00:18:48.800161 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0161a078-8da8-4080-bd86-1f8adfd0b57c-utilities" (OuterVolumeSpecName: "utilities") pod "0161a078-8da8-4080-bd86-1f8adfd0b57c" (UID: "0161a078-8da8-4080-bd86-1f8adfd0b57c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:18:48 crc kubenswrapper[4906]: I0310 00:18:48.809191 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0161a078-8da8-4080-bd86-1f8adfd0b57c-kube-api-access-gbv5p" (OuterVolumeSpecName: "kube-api-access-gbv5p") pod "0161a078-8da8-4080-bd86-1f8adfd0b57c" (UID: "0161a078-8da8-4080-bd86-1f8adfd0b57c"). InnerVolumeSpecName "kube-api-access-gbv5p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:18:48 crc kubenswrapper[4906]: I0310 00:18:48.847602 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0161a078-8da8-4080-bd86-1f8adfd0b57c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0161a078-8da8-4080-bd86-1f8adfd0b57c" (UID: "0161a078-8da8-4080-bd86-1f8adfd0b57c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:18:48 crc kubenswrapper[4906]: I0310 00:18:48.899870 4906 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0161a078-8da8-4080-bd86-1f8adfd0b57c-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 00:18:48 crc kubenswrapper[4906]: I0310 00:18:48.899919 4906 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0161a078-8da8-4080-bd86-1f8adfd0b57c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 00:18:48 crc kubenswrapper[4906]: I0310 00:18:48.899940 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbv5p\" (UniqueName: \"kubernetes.io/projected/0161a078-8da8-4080-bd86-1f8adfd0b57c-kube-api-access-gbv5p\") on node \"crc\" DevicePath \"\"" Mar 10 00:18:49 crc kubenswrapper[4906]: I0310 00:18:49.588363 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4cfdg" event={"ID":"0161a078-8da8-4080-bd86-1f8adfd0b57c","Type":"ContainerDied","Data":"933e4415816dbf10cf8f3c3fe9658c7636764303a253e905a19e9de76ddb9de9"} Mar 10 00:18:49 crc kubenswrapper[4906]: I0310 00:18:49.588420 4906 scope.go:117] "RemoveContainer" containerID="d7c471fe26d5a03baf3db8c54f5cf5add3436b1e9e3d3ce8798101f4493e513b" Mar 10 00:18:49 crc kubenswrapper[4906]: I0310 00:18:49.588545 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4cfdg" Mar 10 00:18:49 crc kubenswrapper[4906]: I0310 00:18:49.626029 4906 scope.go:117] "RemoveContainer" containerID="e3f7359e9f207f873fcec10743ba6cdbf24703c42302ab2dbd7e6039f81062bd" Mar 10 00:18:49 crc kubenswrapper[4906]: I0310 00:18:49.629954 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4cfdg"] Mar 10 00:18:49 crc kubenswrapper[4906]: I0310 00:18:49.633202 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4cfdg"] Mar 10 00:18:49 crc kubenswrapper[4906]: I0310 00:18:49.650075 4906 scope.go:117] "RemoveContainer" containerID="6d048c1b1fb5e5cc9168f47665f16891a77c2179593269f5d368dc895740a4a1" Mar 10 00:18:50 crc kubenswrapper[4906]: I0310 00:18:50.586373 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0161a078-8da8-4080-bd86-1f8adfd0b57c" path="/var/lib/kubelet/pods/0161a078-8da8-4080-bd86-1f8adfd0b57c/volumes" Mar 10 00:18:52 crc kubenswrapper[4906]: I0310 00:18:52.727355 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cv5f9"] Mar 10 00:18:52 crc kubenswrapper[4906]: E0310 00:18:52.728225 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0161a078-8da8-4080-bd86-1f8adfd0b57c" containerName="extract-utilities" Mar 10 00:18:52 crc kubenswrapper[4906]: I0310 00:18:52.728248 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="0161a078-8da8-4080-bd86-1f8adfd0b57c" containerName="extract-utilities" Mar 10 00:18:52 crc kubenswrapper[4906]: E0310 00:18:52.728275 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3f8803e-330e-4afb-af29-1c613251cd1c" containerName="oc" Mar 10 00:18:52 crc kubenswrapper[4906]: I0310 00:18:52.728287 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3f8803e-330e-4afb-af29-1c613251cd1c" 
containerName="oc" Mar 10 00:18:52 crc kubenswrapper[4906]: E0310 00:18:52.728306 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0161a078-8da8-4080-bd86-1f8adfd0b57c" containerName="registry-server" Mar 10 00:18:52 crc kubenswrapper[4906]: I0310 00:18:52.728322 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="0161a078-8da8-4080-bd86-1f8adfd0b57c" containerName="registry-server" Mar 10 00:18:52 crc kubenswrapper[4906]: E0310 00:18:52.728342 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0161a078-8da8-4080-bd86-1f8adfd0b57c" containerName="extract-content" Mar 10 00:18:52 crc kubenswrapper[4906]: I0310 00:18:52.728354 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="0161a078-8da8-4080-bd86-1f8adfd0b57c" containerName="extract-content" Mar 10 00:18:52 crc kubenswrapper[4906]: I0310 00:18:52.728521 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="0161a078-8da8-4080-bd86-1f8adfd0b57c" containerName="registry-server" Mar 10 00:18:52 crc kubenswrapper[4906]: I0310 00:18:52.728544 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3f8803e-330e-4afb-af29-1c613251cd1c" containerName="oc" Mar 10 00:18:52 crc kubenswrapper[4906]: I0310 00:18:52.730033 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cv5f9" Mar 10 00:18:52 crc kubenswrapper[4906]: I0310 00:18:52.733759 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 10 00:18:52 crc kubenswrapper[4906]: I0310 00:18:52.740302 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cv5f9"] Mar 10 00:18:52 crc kubenswrapper[4906]: I0310 00:18:52.760115 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46nvp\" (UniqueName: \"kubernetes.io/projected/a7ead5da-0c4e-4774-b55c-fcacd2943780-kube-api-access-46nvp\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cv5f9\" (UID: \"a7ead5da-0c4e-4774-b55c-fcacd2943780\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cv5f9" Mar 10 00:18:52 crc kubenswrapper[4906]: I0310 00:18:52.760185 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a7ead5da-0c4e-4774-b55c-fcacd2943780-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cv5f9\" (UID: \"a7ead5da-0c4e-4774-b55c-fcacd2943780\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cv5f9" Mar 10 00:18:52 crc kubenswrapper[4906]: I0310 00:18:52.760255 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a7ead5da-0c4e-4774-b55c-fcacd2943780-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cv5f9\" (UID: \"a7ead5da-0c4e-4774-b55c-fcacd2943780\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cv5f9" Mar 10 00:18:52 crc kubenswrapper[4906]: 
I0310 00:18:52.861394 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46nvp\" (UniqueName: \"kubernetes.io/projected/a7ead5da-0c4e-4774-b55c-fcacd2943780-kube-api-access-46nvp\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cv5f9\" (UID: \"a7ead5da-0c4e-4774-b55c-fcacd2943780\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cv5f9" Mar 10 00:18:52 crc kubenswrapper[4906]: I0310 00:18:52.861480 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a7ead5da-0c4e-4774-b55c-fcacd2943780-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cv5f9\" (UID: \"a7ead5da-0c4e-4774-b55c-fcacd2943780\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cv5f9" Mar 10 00:18:52 crc kubenswrapper[4906]: I0310 00:18:52.861549 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a7ead5da-0c4e-4774-b55c-fcacd2943780-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cv5f9\" (UID: \"a7ead5da-0c4e-4774-b55c-fcacd2943780\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cv5f9" Mar 10 00:18:52 crc kubenswrapper[4906]: I0310 00:18:52.862260 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a7ead5da-0c4e-4774-b55c-fcacd2943780-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cv5f9\" (UID: \"a7ead5da-0c4e-4774-b55c-fcacd2943780\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cv5f9" Mar 10 00:18:52 crc kubenswrapper[4906]: I0310 00:18:52.862467 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/a7ead5da-0c4e-4774-b55c-fcacd2943780-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cv5f9\" (UID: \"a7ead5da-0c4e-4774-b55c-fcacd2943780\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cv5f9" Mar 10 00:18:52 crc kubenswrapper[4906]: I0310 00:18:52.894512 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46nvp\" (UniqueName: \"kubernetes.io/projected/a7ead5da-0c4e-4774-b55c-fcacd2943780-kube-api-access-46nvp\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cv5f9\" (UID: \"a7ead5da-0c4e-4774-b55c-fcacd2943780\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cv5f9" Mar 10 00:18:53 crc kubenswrapper[4906]: I0310 00:18:53.060848 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cv5f9" Mar 10 00:18:53 crc kubenswrapper[4906]: I0310 00:18:53.372866 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cv5f9"] Mar 10 00:18:53 crc kubenswrapper[4906]: W0310 00:18:53.385282 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7ead5da_0c4e_4774_b55c_fcacd2943780.slice/crio-4e7142e2749c488cba768d58b76a2398231af2d5cca30b3b5d695a015e09ed2d WatchSource:0}: Error finding container 4e7142e2749c488cba768d58b76a2398231af2d5cca30b3b5d695a015e09ed2d: Status 404 returned error can't find the container with id 4e7142e2749c488cba768d58b76a2398231af2d5cca30b3b5d695a015e09ed2d Mar 10 00:18:53 crc kubenswrapper[4906]: I0310 00:18:53.624838 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cv5f9" 
event={"ID":"a7ead5da-0c4e-4774-b55c-fcacd2943780","Type":"ContainerStarted","Data":"559c91691d8acfdc493e9839a1c8633e0798d3c7f5f6c4c79f2441bd689d7b4d"} Mar 10 00:18:53 crc kubenswrapper[4906]: I0310 00:18:53.625462 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cv5f9" event={"ID":"a7ead5da-0c4e-4774-b55c-fcacd2943780","Type":"ContainerStarted","Data":"4e7142e2749c488cba768d58b76a2398231af2d5cca30b3b5d695a015e09ed2d"} Mar 10 00:18:54 crc kubenswrapper[4906]: I0310 00:18:54.634221 4906 generic.go:334] "Generic (PLEG): container finished" podID="a7ead5da-0c4e-4774-b55c-fcacd2943780" containerID="559c91691d8acfdc493e9839a1c8633e0798d3c7f5f6c4c79f2441bd689d7b4d" exitCode=0 Mar 10 00:18:54 crc kubenswrapper[4906]: I0310 00:18:54.638140 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cv5f9" event={"ID":"a7ead5da-0c4e-4774-b55c-fcacd2943780","Type":"ContainerDied","Data":"559c91691d8acfdc493e9839a1c8633e0798d3c7f5f6c4c79f2441bd689d7b4d"} Mar 10 00:18:56 crc kubenswrapper[4906]: I0310 00:18:56.658179 4906 generic.go:334] "Generic (PLEG): container finished" podID="a7ead5da-0c4e-4774-b55c-fcacd2943780" containerID="aa2def8638bf518e82355aec8a2e0b899d942daa6b85ae44abefeb8743844b32" exitCode=0 Mar 10 00:18:56 crc kubenswrapper[4906]: I0310 00:18:56.658296 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cv5f9" event={"ID":"a7ead5da-0c4e-4774-b55c-fcacd2943780","Type":"ContainerDied","Data":"aa2def8638bf518e82355aec8a2e0b899d942daa6b85ae44abefeb8743844b32"} Mar 10 00:18:57 crc kubenswrapper[4906]: I0310 00:18:57.673296 4906 generic.go:334] "Generic (PLEG): container finished" podID="a7ead5da-0c4e-4774-b55c-fcacd2943780" containerID="a0ba9bb608890b132da00f2e2bb264161df56e01ae0fbc4c9065fa6bdc8e8ba8" exitCode=0 
Mar 10 00:18:57 crc kubenswrapper[4906]: I0310 00:18:57.673564 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cv5f9" event={"ID":"a7ead5da-0c4e-4774-b55c-fcacd2943780","Type":"ContainerDied","Data":"a0ba9bb608890b132da00f2e2bb264161df56e01ae0fbc4c9065fa6bdc8e8ba8"} Mar 10 00:18:58 crc kubenswrapper[4906]: I0310 00:18:58.994982 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cv5f9" Mar 10 00:18:59 crc kubenswrapper[4906]: I0310 00:18:59.152562 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a7ead5da-0c4e-4774-b55c-fcacd2943780-util\") pod \"a7ead5da-0c4e-4774-b55c-fcacd2943780\" (UID: \"a7ead5da-0c4e-4774-b55c-fcacd2943780\") " Mar 10 00:18:59 crc kubenswrapper[4906]: I0310 00:18:59.153723 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a7ead5da-0c4e-4774-b55c-fcacd2943780-bundle\") pod \"a7ead5da-0c4e-4774-b55c-fcacd2943780\" (UID: \"a7ead5da-0c4e-4774-b55c-fcacd2943780\") " Mar 10 00:18:59 crc kubenswrapper[4906]: I0310 00:18:59.153800 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46nvp\" (UniqueName: \"kubernetes.io/projected/a7ead5da-0c4e-4774-b55c-fcacd2943780-kube-api-access-46nvp\") pod \"a7ead5da-0c4e-4774-b55c-fcacd2943780\" (UID: \"a7ead5da-0c4e-4774-b55c-fcacd2943780\") " Mar 10 00:18:59 crc kubenswrapper[4906]: I0310 00:18:59.157984 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7ead5da-0c4e-4774-b55c-fcacd2943780-bundle" (OuterVolumeSpecName: "bundle") pod "a7ead5da-0c4e-4774-b55c-fcacd2943780" (UID: "a7ead5da-0c4e-4774-b55c-fcacd2943780"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:18:59 crc kubenswrapper[4906]: I0310 00:18:59.163984 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7ead5da-0c4e-4774-b55c-fcacd2943780-kube-api-access-46nvp" (OuterVolumeSpecName: "kube-api-access-46nvp") pod "a7ead5da-0c4e-4774-b55c-fcacd2943780" (UID: "a7ead5da-0c4e-4774-b55c-fcacd2943780"). InnerVolumeSpecName "kube-api-access-46nvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:18:59 crc kubenswrapper[4906]: I0310 00:18:59.175939 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7ead5da-0c4e-4774-b55c-fcacd2943780-util" (OuterVolumeSpecName: "util") pod "a7ead5da-0c4e-4774-b55c-fcacd2943780" (UID: "a7ead5da-0c4e-4774-b55c-fcacd2943780"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:18:59 crc kubenswrapper[4906]: I0310 00:18:59.255210 4906 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a7ead5da-0c4e-4774-b55c-fcacd2943780-util\") on node \"crc\" DevicePath \"\"" Mar 10 00:18:59 crc kubenswrapper[4906]: I0310 00:18:59.255251 4906 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a7ead5da-0c4e-4774-b55c-fcacd2943780-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 00:18:59 crc kubenswrapper[4906]: I0310 00:18:59.255263 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46nvp\" (UniqueName: \"kubernetes.io/projected/a7ead5da-0c4e-4774-b55c-fcacd2943780-kube-api-access-46nvp\") on node \"crc\" DevicePath \"\"" Mar 10 00:18:59 crc kubenswrapper[4906]: I0310 00:18:59.289204 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39elzlz7"] Mar 10 00:18:59 crc kubenswrapper[4906]: E0310 00:18:59.289441 4906 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7ead5da-0c4e-4774-b55c-fcacd2943780" containerName="extract" Mar 10 00:18:59 crc kubenswrapper[4906]: I0310 00:18:59.289456 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7ead5da-0c4e-4774-b55c-fcacd2943780" containerName="extract" Mar 10 00:18:59 crc kubenswrapper[4906]: E0310 00:18:59.289469 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7ead5da-0c4e-4774-b55c-fcacd2943780" containerName="pull" Mar 10 00:18:59 crc kubenswrapper[4906]: I0310 00:18:59.289475 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7ead5da-0c4e-4774-b55c-fcacd2943780" containerName="pull" Mar 10 00:18:59 crc kubenswrapper[4906]: E0310 00:18:59.289482 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7ead5da-0c4e-4774-b55c-fcacd2943780" containerName="util" Mar 10 00:18:59 crc kubenswrapper[4906]: I0310 00:18:59.289488 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7ead5da-0c4e-4774-b55c-fcacd2943780" containerName="util" Mar 10 00:18:59 crc kubenswrapper[4906]: I0310 00:18:59.289585 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7ead5da-0c4e-4774-b55c-fcacd2943780" containerName="extract" Mar 10 00:18:59 crc kubenswrapper[4906]: I0310 00:18:59.290310 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39elzlz7" Mar 10 00:18:59 crc kubenswrapper[4906]: I0310 00:18:59.301849 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39elzlz7"] Mar 10 00:18:59 crc kubenswrapper[4906]: I0310 00:18:59.457818 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjhgb\" (UniqueName: \"kubernetes.io/projected/9f632493-caa4-489a-8960-e6980ffd1659-kube-api-access-tjhgb\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39elzlz7\" (UID: \"9f632493-caa4-489a-8960-e6980ffd1659\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39elzlz7" Mar 10 00:18:59 crc kubenswrapper[4906]: I0310 00:18:59.457880 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9f632493-caa4-489a-8960-e6980ffd1659-bundle\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39elzlz7\" (UID: \"9f632493-caa4-489a-8960-e6980ffd1659\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39elzlz7" Mar 10 00:18:59 crc kubenswrapper[4906]: I0310 00:18:59.457915 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9f632493-caa4-489a-8960-e6980ffd1659-util\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39elzlz7\" (UID: \"9f632493-caa4-489a-8960-e6980ffd1659\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39elzlz7" Mar 10 00:18:59 crc kubenswrapper[4906]: I0310 00:18:59.560196 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjhgb\" (UniqueName: 
\"kubernetes.io/projected/9f632493-caa4-489a-8960-e6980ffd1659-kube-api-access-tjhgb\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39elzlz7\" (UID: \"9f632493-caa4-489a-8960-e6980ffd1659\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39elzlz7" Mar 10 00:18:59 crc kubenswrapper[4906]: I0310 00:18:59.560299 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9f632493-caa4-489a-8960-e6980ffd1659-bundle\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39elzlz7\" (UID: \"9f632493-caa4-489a-8960-e6980ffd1659\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39elzlz7" Mar 10 00:18:59 crc kubenswrapper[4906]: I0310 00:18:59.560371 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9f632493-caa4-489a-8960-e6980ffd1659-util\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39elzlz7\" (UID: \"9f632493-caa4-489a-8960-e6980ffd1659\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39elzlz7" Mar 10 00:18:59 crc kubenswrapper[4906]: I0310 00:18:59.561006 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9f632493-caa4-489a-8960-e6980ffd1659-bundle\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39elzlz7\" (UID: \"9f632493-caa4-489a-8960-e6980ffd1659\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39elzlz7" Mar 10 00:18:59 crc kubenswrapper[4906]: I0310 00:18:59.561072 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9f632493-caa4-489a-8960-e6980ffd1659-util\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39elzlz7\" (UID: 
\"9f632493-caa4-489a-8960-e6980ffd1659\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39elzlz7" Mar 10 00:18:59 crc kubenswrapper[4906]: I0310 00:18:59.578458 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjhgb\" (UniqueName: \"kubernetes.io/projected/9f632493-caa4-489a-8960-e6980ffd1659-kube-api-access-tjhgb\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39elzlz7\" (UID: \"9f632493-caa4-489a-8960-e6980ffd1659\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39elzlz7" Mar 10 00:18:59 crc kubenswrapper[4906]: I0310 00:18:59.654664 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39elzlz7" Mar 10 00:18:59 crc kubenswrapper[4906]: I0310 00:18:59.692839 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cv5f9" event={"ID":"a7ead5da-0c4e-4774-b55c-fcacd2943780","Type":"ContainerDied","Data":"4e7142e2749c488cba768d58b76a2398231af2d5cca30b3b5d695a015e09ed2d"} Mar 10 00:18:59 crc kubenswrapper[4906]: I0310 00:18:59.692907 4906 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e7142e2749c488cba768d58b76a2398231af2d5cca30b3b5d695a015e09ed2d" Mar 10 00:18:59 crc kubenswrapper[4906]: I0310 00:18:59.693013 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cv5f9" Mar 10 00:18:59 crc kubenswrapper[4906]: I0310 00:18:59.901408 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39elzlz7"] Mar 10 00:18:59 crc kubenswrapper[4906]: W0310 00:18:59.903839 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f632493_caa4_489a_8960_e6980ffd1659.slice/crio-ce1750e47c73a94a37a7c7646ac35881116610be8667d99c7caaeed34778d8d1 WatchSource:0}: Error finding container ce1750e47c73a94a37a7c7646ac35881116610be8667d99c7caaeed34778d8d1: Status 404 returned error can't find the container with id ce1750e47c73a94a37a7c7646ac35881116610be8667d99c7caaeed34778d8d1 Mar 10 00:19:00 crc kubenswrapper[4906]: I0310 00:19:00.297827 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxrb6c"] Mar 10 00:19:00 crc kubenswrapper[4906]: I0310 00:19:00.299466 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxrb6c" Mar 10 00:19:00 crc kubenswrapper[4906]: I0310 00:19:00.315493 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxrb6c"] Mar 10 00:19:00 crc kubenswrapper[4906]: I0310 00:19:00.474351 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/98165996-cb0a-4820-b1b0-e014d3362b2d-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxrb6c\" (UID: \"98165996-cb0a-4820-b1b0-e014d3362b2d\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxrb6c" Mar 10 00:19:00 crc kubenswrapper[4906]: I0310 00:19:00.474806 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phfld\" (UniqueName: \"kubernetes.io/projected/98165996-cb0a-4820-b1b0-e014d3362b2d-kube-api-access-phfld\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxrb6c\" (UID: \"98165996-cb0a-4820-b1b0-e014d3362b2d\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxrb6c" Mar 10 00:19:00 crc kubenswrapper[4906]: I0310 00:19:00.474992 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/98165996-cb0a-4820-b1b0-e014d3362b2d-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxrb6c\" (UID: \"98165996-cb0a-4820-b1b0-e014d3362b2d\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxrb6c" Mar 10 00:19:00 crc kubenswrapper[4906]: I0310 00:19:00.502438 4906 patch_prober.go:28] interesting pod/machine-config-daemon-bxtw4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:19:00 crc kubenswrapper[4906]: I0310 00:19:00.502525 4906 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:19:00 crc kubenswrapper[4906]: I0310 00:19:00.576684 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phfld\" (UniqueName: \"kubernetes.io/projected/98165996-cb0a-4820-b1b0-e014d3362b2d-kube-api-access-phfld\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxrb6c\" (UID: \"98165996-cb0a-4820-b1b0-e014d3362b2d\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxrb6c" Mar 10 00:19:00 crc kubenswrapper[4906]: I0310 00:19:00.576775 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/98165996-cb0a-4820-b1b0-e014d3362b2d-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxrb6c\" (UID: \"98165996-cb0a-4820-b1b0-e014d3362b2d\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxrb6c" Mar 10 00:19:00 crc kubenswrapper[4906]: I0310 00:19:00.576843 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/98165996-cb0a-4820-b1b0-e014d3362b2d-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxrb6c\" (UID: \"98165996-cb0a-4820-b1b0-e014d3362b2d\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxrb6c" Mar 10 00:19:00 crc kubenswrapper[4906]: I0310 00:19:00.577441 4906 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/98165996-cb0a-4820-b1b0-e014d3362b2d-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxrb6c\" (UID: \"98165996-cb0a-4820-b1b0-e014d3362b2d\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxrb6c" Mar 10 00:19:00 crc kubenswrapper[4906]: I0310 00:19:00.577736 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/98165996-cb0a-4820-b1b0-e014d3362b2d-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxrb6c\" (UID: \"98165996-cb0a-4820-b1b0-e014d3362b2d\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxrb6c" Mar 10 00:19:00 crc kubenswrapper[4906]: I0310 00:19:00.609936 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phfld\" (UniqueName: \"kubernetes.io/projected/98165996-cb0a-4820-b1b0-e014d3362b2d-kube-api-access-phfld\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxrb6c\" (UID: \"98165996-cb0a-4820-b1b0-e014d3362b2d\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxrb6c" Mar 10 00:19:00 crc kubenswrapper[4906]: I0310 00:19:00.628359 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxrb6c" Mar 10 00:19:00 crc kubenswrapper[4906]: I0310 00:19:00.704567 4906 generic.go:334] "Generic (PLEG): container finished" podID="9f632493-caa4-489a-8960-e6980ffd1659" containerID="1c6ee69e8750336e898e3627365e3cc5d47e9ec1c80cf1aea2678fa041f64a0e" exitCode=0 Mar 10 00:19:00 crc kubenswrapper[4906]: I0310 00:19:00.704671 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39elzlz7" event={"ID":"9f632493-caa4-489a-8960-e6980ffd1659","Type":"ContainerDied","Data":"1c6ee69e8750336e898e3627365e3cc5d47e9ec1c80cf1aea2678fa041f64a0e"} Mar 10 00:19:00 crc kubenswrapper[4906]: I0310 00:19:00.704727 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39elzlz7" event={"ID":"9f632493-caa4-489a-8960-e6980ffd1659","Type":"ContainerStarted","Data":"ce1750e47c73a94a37a7c7646ac35881116610be8667d99c7caaeed34778d8d1"} Mar 10 00:19:00 crc kubenswrapper[4906]: I0310 00:19:00.931490 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxrb6c"] Mar 10 00:19:01 crc kubenswrapper[4906]: I0310 00:19:01.714598 4906 generic.go:334] "Generic (PLEG): container finished" podID="9f632493-caa4-489a-8960-e6980ffd1659" containerID="44a284b024dba0c24a56413fd38f2c635d18e71362c32e95ef256a62cddbebab" exitCode=0 Mar 10 00:19:01 crc kubenswrapper[4906]: I0310 00:19:01.714728 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39elzlz7" event={"ID":"9f632493-caa4-489a-8960-e6980ffd1659","Type":"ContainerDied","Data":"44a284b024dba0c24a56413fd38f2c635d18e71362c32e95ef256a62cddbebab"} Mar 10 00:19:01 crc kubenswrapper[4906]: I0310 00:19:01.718578 4906 
generic.go:334] "Generic (PLEG): container finished" podID="98165996-cb0a-4820-b1b0-e014d3362b2d" containerID="c4763c44075e4e34710e33cd3e1350677456e86f08620f0cd89f8bd94aa9f7bd" exitCode=0 Mar 10 00:19:01 crc kubenswrapper[4906]: I0310 00:19:01.718625 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxrb6c" event={"ID":"98165996-cb0a-4820-b1b0-e014d3362b2d","Type":"ContainerDied","Data":"c4763c44075e4e34710e33cd3e1350677456e86f08620f0cd89f8bd94aa9f7bd"} Mar 10 00:19:01 crc kubenswrapper[4906]: I0310 00:19:01.718729 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxrb6c" event={"ID":"98165996-cb0a-4820-b1b0-e014d3362b2d","Type":"ContainerStarted","Data":"96ee394c8037d7cfb5ad68aef149c4ed16ce70aea87fc6b1713abbf6c5ec3d47"} Mar 10 00:19:02 crc kubenswrapper[4906]: I0310 00:19:02.728031 4906 generic.go:334] "Generic (PLEG): container finished" podID="9f632493-caa4-489a-8960-e6980ffd1659" containerID="a0c0722b4c6a891bcd61feb9966c4cf41aa2383e1615bf4e5d0fc6d5d07b39af" exitCode=0 Mar 10 00:19:02 crc kubenswrapper[4906]: I0310 00:19:02.728229 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39elzlz7" event={"ID":"9f632493-caa4-489a-8960-e6980ffd1659","Type":"ContainerDied","Data":"a0c0722b4c6a891bcd61feb9966c4cf41aa2383e1615bf4e5d0fc6d5d07b39af"} Mar 10 00:19:03 crc kubenswrapper[4906]: I0310 00:19:03.735837 4906 generic.go:334] "Generic (PLEG): container finished" podID="98165996-cb0a-4820-b1b0-e014d3362b2d" containerID="2f8e3fb0a1113307aec555b709e9bbed571ec61ecf90d850dcaa1be1b32ff4d1" exitCode=0 Mar 10 00:19:03 crc kubenswrapper[4906]: I0310 00:19:03.736089 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxrb6c" 
event={"ID":"98165996-cb0a-4820-b1b0-e014d3362b2d","Type":"ContainerDied","Data":"2f8e3fb0a1113307aec555b709e9bbed571ec61ecf90d850dcaa1be1b32ff4d1"} Mar 10 00:19:04 crc kubenswrapper[4906]: I0310 00:19:04.105292 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39elzlz7" Mar 10 00:19:04 crc kubenswrapper[4906]: I0310 00:19:04.237926 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9f632493-caa4-489a-8960-e6980ffd1659-bundle\") pod \"9f632493-caa4-489a-8960-e6980ffd1659\" (UID: \"9f632493-caa4-489a-8960-e6980ffd1659\") " Mar 10 00:19:04 crc kubenswrapper[4906]: I0310 00:19:04.237977 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjhgb\" (UniqueName: \"kubernetes.io/projected/9f632493-caa4-489a-8960-e6980ffd1659-kube-api-access-tjhgb\") pod \"9f632493-caa4-489a-8960-e6980ffd1659\" (UID: \"9f632493-caa4-489a-8960-e6980ffd1659\") " Mar 10 00:19:04 crc kubenswrapper[4906]: I0310 00:19:04.238945 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9f632493-caa4-489a-8960-e6980ffd1659-util\") pod \"9f632493-caa4-489a-8960-e6980ffd1659\" (UID: \"9f632493-caa4-489a-8960-e6980ffd1659\") " Mar 10 00:19:04 crc kubenswrapper[4906]: I0310 00:19:04.238961 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f632493-caa4-489a-8960-e6980ffd1659-bundle" (OuterVolumeSpecName: "bundle") pod "9f632493-caa4-489a-8960-e6980ffd1659" (UID: "9f632493-caa4-489a-8960-e6980ffd1659"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:19:04 crc kubenswrapper[4906]: I0310 00:19:04.239099 4906 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9f632493-caa4-489a-8960-e6980ffd1659-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 00:19:04 crc kubenswrapper[4906]: I0310 00:19:04.245357 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f632493-caa4-489a-8960-e6980ffd1659-kube-api-access-tjhgb" (OuterVolumeSpecName: "kube-api-access-tjhgb") pod "9f632493-caa4-489a-8960-e6980ffd1659" (UID: "9f632493-caa4-489a-8960-e6980ffd1659"). InnerVolumeSpecName "kube-api-access-tjhgb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:19:04 crc kubenswrapper[4906]: I0310 00:19:04.264787 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f632493-caa4-489a-8960-e6980ffd1659-util" (OuterVolumeSpecName: "util") pod "9f632493-caa4-489a-8960-e6980ffd1659" (UID: "9f632493-caa4-489a-8960-e6980ffd1659"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:19:04 crc kubenswrapper[4906]: I0310 00:19:04.339828 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjhgb\" (UniqueName: \"kubernetes.io/projected/9f632493-caa4-489a-8960-e6980ffd1659-kube-api-access-tjhgb\") on node \"crc\" DevicePath \"\"" Mar 10 00:19:04 crc kubenswrapper[4906]: I0310 00:19:04.339867 4906 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9f632493-caa4-489a-8960-e6980ffd1659-util\") on node \"crc\" DevicePath \"\"" Mar 10 00:19:04 crc kubenswrapper[4906]: I0310 00:19:04.747269 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39elzlz7" event={"ID":"9f632493-caa4-489a-8960-e6980ffd1659","Type":"ContainerDied","Data":"ce1750e47c73a94a37a7c7646ac35881116610be8667d99c7caaeed34778d8d1"} Mar 10 00:19:04 crc kubenswrapper[4906]: I0310 00:19:04.748035 4906 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce1750e47c73a94a37a7c7646ac35881116610be8667d99c7caaeed34778d8d1" Mar 10 00:19:04 crc kubenswrapper[4906]: I0310 00:19:04.747577 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39elzlz7" Mar 10 00:19:04 crc kubenswrapper[4906]: I0310 00:19:04.750008 4906 generic.go:334] "Generic (PLEG): container finished" podID="98165996-cb0a-4820-b1b0-e014d3362b2d" containerID="3350201c2ed72c687b7147d4b16cf95d8cee9f5ad337e0bca32fb6e096e0e4c0" exitCode=0 Mar 10 00:19:04 crc kubenswrapper[4906]: I0310 00:19:04.750032 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxrb6c" event={"ID":"98165996-cb0a-4820-b1b0-e014d3362b2d","Type":"ContainerDied","Data":"3350201c2ed72c687b7147d4b16cf95d8cee9f5ad337e0bca32fb6e096e0e4c0"} Mar 10 00:19:06 crc kubenswrapper[4906]: I0310 00:19:06.127476 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxrb6c" Mar 10 00:19:06 crc kubenswrapper[4906]: I0310 00:19:06.266892 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/98165996-cb0a-4820-b1b0-e014d3362b2d-bundle\") pod \"98165996-cb0a-4820-b1b0-e014d3362b2d\" (UID: \"98165996-cb0a-4820-b1b0-e014d3362b2d\") " Mar 10 00:19:06 crc kubenswrapper[4906]: I0310 00:19:06.267496 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phfld\" (UniqueName: \"kubernetes.io/projected/98165996-cb0a-4820-b1b0-e014d3362b2d-kube-api-access-phfld\") pod \"98165996-cb0a-4820-b1b0-e014d3362b2d\" (UID: \"98165996-cb0a-4820-b1b0-e014d3362b2d\") " Mar 10 00:19:06 crc kubenswrapper[4906]: I0310 00:19:06.267530 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/98165996-cb0a-4820-b1b0-e014d3362b2d-util\") pod \"98165996-cb0a-4820-b1b0-e014d3362b2d\" (UID: 
\"98165996-cb0a-4820-b1b0-e014d3362b2d\") " Mar 10 00:19:06 crc kubenswrapper[4906]: I0310 00:19:06.267539 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98165996-cb0a-4820-b1b0-e014d3362b2d-bundle" (OuterVolumeSpecName: "bundle") pod "98165996-cb0a-4820-b1b0-e014d3362b2d" (UID: "98165996-cb0a-4820-b1b0-e014d3362b2d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:19:06 crc kubenswrapper[4906]: I0310 00:19:06.267828 4906 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/98165996-cb0a-4820-b1b0-e014d3362b2d-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 00:19:06 crc kubenswrapper[4906]: I0310 00:19:06.274209 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98165996-cb0a-4820-b1b0-e014d3362b2d-kube-api-access-phfld" (OuterVolumeSpecName: "kube-api-access-phfld") pod "98165996-cb0a-4820-b1b0-e014d3362b2d" (UID: "98165996-cb0a-4820-b1b0-e014d3362b2d"). InnerVolumeSpecName "kube-api-access-phfld". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:19:06 crc kubenswrapper[4906]: I0310 00:19:06.300485 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98165996-cb0a-4820-b1b0-e014d3362b2d-util" (OuterVolumeSpecName: "util") pod "98165996-cb0a-4820-b1b0-e014d3362b2d" (UID: "98165996-cb0a-4820-b1b0-e014d3362b2d"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:19:06 crc kubenswrapper[4906]: I0310 00:19:06.368930 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phfld\" (UniqueName: \"kubernetes.io/projected/98165996-cb0a-4820-b1b0-e014d3362b2d-kube-api-access-phfld\") on node \"crc\" DevicePath \"\"" Mar 10 00:19:06 crc kubenswrapper[4906]: I0310 00:19:06.368974 4906 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/98165996-cb0a-4820-b1b0-e014d3362b2d-util\") on node \"crc\" DevicePath \"\"" Mar 10 00:19:06 crc kubenswrapper[4906]: I0310 00:19:06.778835 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxrb6c" event={"ID":"98165996-cb0a-4820-b1b0-e014d3362b2d","Type":"ContainerDied","Data":"96ee394c8037d7cfb5ad68aef149c4ed16ce70aea87fc6b1713abbf6c5ec3d47"} Mar 10 00:19:06 crc kubenswrapper[4906]: I0310 00:19:06.778883 4906 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96ee394c8037d7cfb5ad68aef149c4ed16ce70aea87fc6b1713abbf6c5ec3d47" Mar 10 00:19:06 crc kubenswrapper[4906]: I0310 00:19:06.778954 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxrb6c" Mar 10 00:19:08 crc kubenswrapper[4906]: I0310 00:19:08.505931 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gchrq"] Mar 10 00:19:08 crc kubenswrapper[4906]: E0310 00:19:08.506478 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98165996-cb0a-4820-b1b0-e014d3362b2d" containerName="pull" Mar 10 00:19:08 crc kubenswrapper[4906]: I0310 00:19:08.506492 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="98165996-cb0a-4820-b1b0-e014d3362b2d" containerName="pull" Mar 10 00:19:08 crc kubenswrapper[4906]: E0310 00:19:08.506503 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98165996-cb0a-4820-b1b0-e014d3362b2d" containerName="extract" Mar 10 00:19:08 crc kubenswrapper[4906]: I0310 00:19:08.506508 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="98165996-cb0a-4820-b1b0-e014d3362b2d" containerName="extract" Mar 10 00:19:08 crc kubenswrapper[4906]: E0310 00:19:08.506522 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f632493-caa4-489a-8960-e6980ffd1659" containerName="util" Mar 10 00:19:08 crc kubenswrapper[4906]: I0310 00:19:08.506529 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f632493-caa4-489a-8960-e6980ffd1659" containerName="util" Mar 10 00:19:08 crc kubenswrapper[4906]: E0310 00:19:08.506538 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98165996-cb0a-4820-b1b0-e014d3362b2d" containerName="util" Mar 10 00:19:08 crc kubenswrapper[4906]: I0310 00:19:08.506544 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="98165996-cb0a-4820-b1b0-e014d3362b2d" containerName="util" Mar 10 00:19:08 crc kubenswrapper[4906]: E0310 00:19:08.506552 4906 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9f632493-caa4-489a-8960-e6980ffd1659" containerName="extract" Mar 10 00:19:08 crc kubenswrapper[4906]: I0310 00:19:08.506558 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f632493-caa4-489a-8960-e6980ffd1659" containerName="extract" Mar 10 00:19:08 crc kubenswrapper[4906]: E0310 00:19:08.506568 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f632493-caa4-489a-8960-e6980ffd1659" containerName="pull" Mar 10 00:19:08 crc kubenswrapper[4906]: I0310 00:19:08.506573 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f632493-caa4-489a-8960-e6980ffd1659" containerName="pull" Mar 10 00:19:08 crc kubenswrapper[4906]: I0310 00:19:08.506660 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f632493-caa4-489a-8960-e6980ffd1659" containerName="extract" Mar 10 00:19:08 crc kubenswrapper[4906]: I0310 00:19:08.506691 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="98165996-cb0a-4820-b1b0-e014d3362b2d" containerName="extract" Mar 10 00:19:08 crc kubenswrapper[4906]: I0310 00:19:08.507445 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gchrq" Mar 10 00:19:08 crc kubenswrapper[4906]: I0310 00:19:08.510358 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 10 00:19:08 crc kubenswrapper[4906]: I0310 00:19:08.518318 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gchrq"] Mar 10 00:19:08 crc kubenswrapper[4906]: I0310 00:19:08.599155 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/79926529-086e-4612-ac41-ad0e16ac2a4d-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gchrq\" (UID: \"79926529-086e-4612-ac41-ad0e16ac2a4d\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gchrq" Mar 10 00:19:08 crc kubenswrapper[4906]: I0310 00:19:08.599302 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg9zb\" (UniqueName: \"kubernetes.io/projected/79926529-086e-4612-ac41-ad0e16ac2a4d-kube-api-access-xg9zb\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gchrq\" (UID: \"79926529-086e-4612-ac41-ad0e16ac2a4d\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gchrq" Mar 10 00:19:08 crc kubenswrapper[4906]: I0310 00:19:08.599587 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/79926529-086e-4612-ac41-ad0e16ac2a4d-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gchrq\" (UID: \"79926529-086e-4612-ac41-ad0e16ac2a4d\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gchrq" Mar 10 00:19:08 crc kubenswrapper[4906]: 
I0310 00:19:08.701158 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/79926529-086e-4612-ac41-ad0e16ac2a4d-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gchrq\" (UID: \"79926529-086e-4612-ac41-ad0e16ac2a4d\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gchrq" Mar 10 00:19:08 crc kubenswrapper[4906]: I0310 00:19:08.701234 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xg9zb\" (UniqueName: \"kubernetes.io/projected/79926529-086e-4612-ac41-ad0e16ac2a4d-kube-api-access-xg9zb\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gchrq\" (UID: \"79926529-086e-4612-ac41-ad0e16ac2a4d\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gchrq" Mar 10 00:19:08 crc kubenswrapper[4906]: I0310 00:19:08.701269 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/79926529-086e-4612-ac41-ad0e16ac2a4d-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gchrq\" (UID: \"79926529-086e-4612-ac41-ad0e16ac2a4d\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gchrq" Mar 10 00:19:08 crc kubenswrapper[4906]: I0310 00:19:08.702056 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/79926529-086e-4612-ac41-ad0e16ac2a4d-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gchrq\" (UID: \"79926529-086e-4612-ac41-ad0e16ac2a4d\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gchrq" Mar 10 00:19:08 crc kubenswrapper[4906]: I0310 00:19:08.702070 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/79926529-086e-4612-ac41-ad0e16ac2a4d-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gchrq\" (UID: \"79926529-086e-4612-ac41-ad0e16ac2a4d\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gchrq" Mar 10 00:19:08 crc kubenswrapper[4906]: I0310 00:19:08.727848 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg9zb\" (UniqueName: \"kubernetes.io/projected/79926529-086e-4612-ac41-ad0e16ac2a4d-kube-api-access-xg9zb\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gchrq\" (UID: \"79926529-086e-4612-ac41-ad0e16ac2a4d\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gchrq" Mar 10 00:19:08 crc kubenswrapper[4906]: I0310 00:19:08.822057 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gchrq" Mar 10 00:19:09 crc kubenswrapper[4906]: I0310 00:19:09.088729 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gchrq"] Mar 10 00:19:09 crc kubenswrapper[4906]: W0310 00:19:09.101863 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79926529_086e_4612_ac41_ad0e16ac2a4d.slice/crio-86a8aef5d0126afaa08ce5910ad97b493deff24e79508f0db84479e90787dae9 WatchSource:0}: Error finding container 86a8aef5d0126afaa08ce5910ad97b493deff24e79508f0db84479e90787dae9: Status 404 returned error can't find the container with id 86a8aef5d0126afaa08ce5910ad97b493deff24e79508f0db84479e90787dae9 Mar 10 00:19:09 crc kubenswrapper[4906]: I0310 00:19:09.799578 4906 generic.go:334] "Generic (PLEG): container finished" podID="79926529-086e-4612-ac41-ad0e16ac2a4d" containerID="52fdabb60f8c10ff30867ebac9e239034562bb9cde3a584720f9fc8d4379b3b3" exitCode=0 
Mar 10 00:19:09 crc kubenswrapper[4906]: I0310 00:19:09.799674 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gchrq" event={"ID":"79926529-086e-4612-ac41-ad0e16ac2a4d","Type":"ContainerDied","Data":"52fdabb60f8c10ff30867ebac9e239034562bb9cde3a584720f9fc8d4379b3b3"} Mar 10 00:19:09 crc kubenswrapper[4906]: I0310 00:19:09.800014 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gchrq" event={"ID":"79926529-086e-4612-ac41-ad0e16ac2a4d","Type":"ContainerStarted","Data":"86a8aef5d0126afaa08ce5910ad97b493deff24e79508f0db84479e90787dae9"} Mar 10 00:19:10 crc kubenswrapper[4906]: I0310 00:19:10.021643 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-qm858"] Mar 10 00:19:10 crc kubenswrapper[4906]: I0310 00:19:10.022365 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-qm858" Mar 10 00:19:10 crc kubenswrapper[4906]: I0310 00:19:10.024494 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Mar 10 00:19:10 crc kubenswrapper[4906]: I0310 00:19:10.025116 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Mar 10 00:19:10 crc kubenswrapper[4906]: I0310 00:19:10.026052 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-5224t" Mar 10 00:19:10 crc kubenswrapper[4906]: I0310 00:19:10.046036 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-qm858"] Mar 10 00:19:10 crc kubenswrapper[4906]: I0310 00:19:10.122597 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2mxq\" (UniqueName: \"kubernetes.io/projected/a13dbb4b-4d05-4b5a-9155-863ae39b84d7-kube-api-access-d2mxq\") pod \"obo-prometheus-operator-68bc856cb9-qm858\" (UID: \"a13dbb4b-4d05-4b5a-9155-863ae39b84d7\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-qm858" Mar 10 00:19:10 crc kubenswrapper[4906]: I0310 00:19:10.155723 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6878d69d47-mbmcr"] Mar 10 00:19:10 crc kubenswrapper[4906]: I0310 00:19:10.156452 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6878d69d47-mbmcr" Mar 10 00:19:10 crc kubenswrapper[4906]: I0310 00:19:10.158432 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Mar 10 00:19:10 crc kubenswrapper[4906]: I0310 00:19:10.158521 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-b45ql" Mar 10 00:19:10 crc kubenswrapper[4906]: I0310 00:19:10.165299 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6878d69d47-mlwmd"] Mar 10 00:19:10 crc kubenswrapper[4906]: I0310 00:19:10.166083 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6878d69d47-mlwmd" Mar 10 00:19:10 crc kubenswrapper[4906]: I0310 00:19:10.186107 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6878d69d47-mlwmd"] Mar 10 00:19:10 crc kubenswrapper[4906]: I0310 00:19:10.224697 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2mxq\" (UniqueName: \"kubernetes.io/projected/a13dbb4b-4d05-4b5a-9155-863ae39b84d7-kube-api-access-d2mxq\") pod \"obo-prometheus-operator-68bc856cb9-qm858\" (UID: \"a13dbb4b-4d05-4b5a-9155-863ae39b84d7\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-qm858" Mar 10 00:19:10 crc kubenswrapper[4906]: I0310 00:19:10.241874 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6878d69d47-mbmcr"] Mar 10 00:19:10 crc kubenswrapper[4906]: I0310 00:19:10.254465 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2mxq\" (UniqueName: 
\"kubernetes.io/projected/a13dbb4b-4d05-4b5a-9155-863ae39b84d7-kube-api-access-d2mxq\") pod \"obo-prometheus-operator-68bc856cb9-qm858\" (UID: \"a13dbb4b-4d05-4b5a-9155-863ae39b84d7\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-qm858" Mar 10 00:19:10 crc kubenswrapper[4906]: I0310 00:19:10.325690 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6604103a-2326-470c-b01f-a99fea58b571-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6878d69d47-mbmcr\" (UID: \"6604103a-2326-470c-b01f-a99fea58b571\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6878d69d47-mbmcr" Mar 10 00:19:10 crc kubenswrapper[4906]: I0310 00:19:10.325774 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6f2dab40-1946-479a-9741-637188d9fe10-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6878d69d47-mlwmd\" (UID: \"6f2dab40-1946-479a-9741-637188d9fe10\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6878d69d47-mlwmd" Mar 10 00:19:10 crc kubenswrapper[4906]: I0310 00:19:10.325803 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6f2dab40-1946-479a-9741-637188d9fe10-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6878d69d47-mlwmd\" (UID: \"6f2dab40-1946-479a-9741-637188d9fe10\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6878d69d47-mlwmd" Mar 10 00:19:10 crc kubenswrapper[4906]: I0310 00:19:10.325820 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6604103a-2326-470c-b01f-a99fea58b571-webhook-cert\") pod 
\"obo-prometheus-operator-admission-webhook-6878d69d47-mbmcr\" (UID: \"6604103a-2326-470c-b01f-a99fea58b571\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6878d69d47-mbmcr" Mar 10 00:19:10 crc kubenswrapper[4906]: I0310 00:19:10.338367 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-qm858" Mar 10 00:19:10 crc kubenswrapper[4906]: I0310 00:19:10.369533 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-6w9dp"] Mar 10 00:19:10 crc kubenswrapper[4906]: I0310 00:19:10.370953 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-6w9dp" Mar 10 00:19:10 crc kubenswrapper[4906]: I0310 00:19:10.394410 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Mar 10 00:19:10 crc kubenswrapper[4906]: I0310 00:19:10.394518 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-7cxjj" Mar 10 00:19:10 crc kubenswrapper[4906]: I0310 00:19:10.401695 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-6w9dp"] Mar 10 00:19:10 crc kubenswrapper[4906]: I0310 00:19:10.427857 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6604103a-2326-470c-b01f-a99fea58b571-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6878d69d47-mbmcr\" (UID: \"6604103a-2326-470c-b01f-a99fea58b571\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6878d69d47-mbmcr" Mar 10 00:19:10 crc kubenswrapper[4906]: I0310 00:19:10.427949 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/6f2dab40-1946-479a-9741-637188d9fe10-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6878d69d47-mlwmd\" (UID: \"6f2dab40-1946-479a-9741-637188d9fe10\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6878d69d47-mlwmd" Mar 10 00:19:10 crc kubenswrapper[4906]: I0310 00:19:10.428033 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6604103a-2326-470c-b01f-a99fea58b571-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6878d69d47-mbmcr\" (UID: \"6604103a-2326-470c-b01f-a99fea58b571\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6878d69d47-mbmcr" Mar 10 00:19:10 crc kubenswrapper[4906]: I0310 00:19:10.428053 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6f2dab40-1946-479a-9741-637188d9fe10-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6878d69d47-mlwmd\" (UID: \"6f2dab40-1946-479a-9741-637188d9fe10\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6878d69d47-mlwmd" Mar 10 00:19:10 crc kubenswrapper[4906]: I0310 00:19:10.434271 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6604103a-2326-470c-b01f-a99fea58b571-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6878d69d47-mbmcr\" (UID: \"6604103a-2326-470c-b01f-a99fea58b571\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6878d69d47-mbmcr" Mar 10 00:19:10 crc kubenswrapper[4906]: I0310 00:19:10.434755 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6f2dab40-1946-479a-9741-637188d9fe10-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6878d69d47-mlwmd\" (UID: \"6f2dab40-1946-479a-9741-637188d9fe10\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-6878d69d47-mlwmd" Mar 10 00:19:10 crc kubenswrapper[4906]: I0310 00:19:10.435069 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6f2dab40-1946-479a-9741-637188d9fe10-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6878d69d47-mlwmd\" (UID: \"6f2dab40-1946-479a-9741-637188d9fe10\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6878d69d47-mlwmd" Mar 10 00:19:10 crc kubenswrapper[4906]: I0310 00:19:10.437865 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6604103a-2326-470c-b01f-a99fea58b571-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6878d69d47-mbmcr\" (UID: \"6604103a-2326-470c-b01f-a99fea58b571\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6878d69d47-mbmcr" Mar 10 00:19:10 crc kubenswrapper[4906]: I0310 00:19:10.473520 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6878d69d47-mbmcr" Mar 10 00:19:10 crc kubenswrapper[4906]: I0310 00:19:10.482295 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6878d69d47-mlwmd" Mar 10 00:19:10 crc kubenswrapper[4906]: I0310 00:19:10.528954 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/ba8e917b-ba75-43ca-ba9e-d52c72031402-observability-operator-tls\") pod \"observability-operator-59bdc8b94-6w9dp\" (UID: \"ba8e917b-ba75-43ca-ba9e-d52c72031402\") " pod="openshift-operators/observability-operator-59bdc8b94-6w9dp" Mar 10 00:19:10 crc kubenswrapper[4906]: I0310 00:19:10.529033 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxsmm\" (UniqueName: \"kubernetes.io/projected/ba8e917b-ba75-43ca-ba9e-d52c72031402-kube-api-access-kxsmm\") pod \"observability-operator-59bdc8b94-6w9dp\" (UID: \"ba8e917b-ba75-43ca-ba9e-d52c72031402\") " pod="openshift-operators/observability-operator-59bdc8b94-6w9dp" Mar 10 00:19:10 crc kubenswrapper[4906]: I0310 00:19:10.586487 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-4gjdk"] Mar 10 00:19:10 crc kubenswrapper[4906]: I0310 00:19:10.589539 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-4gjdk"] Mar 10 00:19:10 crc kubenswrapper[4906]: I0310 00:19:10.589649 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-4gjdk" Mar 10 00:19:10 crc kubenswrapper[4906]: I0310 00:19:10.595224 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-zmh9p" Mar 10 00:19:10 crc kubenswrapper[4906]: I0310 00:19:10.641986 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/ba8e917b-ba75-43ca-ba9e-d52c72031402-observability-operator-tls\") pod \"observability-operator-59bdc8b94-6w9dp\" (UID: \"ba8e917b-ba75-43ca-ba9e-d52c72031402\") " pod="openshift-operators/observability-operator-59bdc8b94-6w9dp" Mar 10 00:19:10 crc kubenswrapper[4906]: I0310 00:19:10.642071 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxsmm\" (UniqueName: \"kubernetes.io/projected/ba8e917b-ba75-43ca-ba9e-d52c72031402-kube-api-access-kxsmm\") pod \"observability-operator-59bdc8b94-6w9dp\" (UID: \"ba8e917b-ba75-43ca-ba9e-d52c72031402\") " pod="openshift-operators/observability-operator-59bdc8b94-6w9dp" Mar 10 00:19:10 crc kubenswrapper[4906]: I0310 00:19:10.649804 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/ba8e917b-ba75-43ca-ba9e-d52c72031402-observability-operator-tls\") pod \"observability-operator-59bdc8b94-6w9dp\" (UID: \"ba8e917b-ba75-43ca-ba9e-d52c72031402\") " pod="openshift-operators/observability-operator-59bdc8b94-6w9dp" Mar 10 00:19:10 crc kubenswrapper[4906]: I0310 00:19:10.670348 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxsmm\" (UniqueName: \"kubernetes.io/projected/ba8e917b-ba75-43ca-ba9e-d52c72031402-kube-api-access-kxsmm\") pod \"observability-operator-59bdc8b94-6w9dp\" (UID: \"ba8e917b-ba75-43ca-ba9e-d52c72031402\") " pod="openshift-operators/observability-operator-59bdc8b94-6w9dp" 
Mar 10 00:19:10 crc kubenswrapper[4906]: I0310 00:19:10.723017 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-6w9dp" Mar 10 00:19:10 crc kubenswrapper[4906]: I0310 00:19:10.743568 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q22kg\" (UniqueName: \"kubernetes.io/projected/05f0097c-957a-4679-bfde-f6fd3bb64a31-kube-api-access-q22kg\") pod \"perses-operator-5bf474d74f-4gjdk\" (UID: \"05f0097c-957a-4679-bfde-f6fd3bb64a31\") " pod="openshift-operators/perses-operator-5bf474d74f-4gjdk" Mar 10 00:19:10 crc kubenswrapper[4906]: I0310 00:19:10.744091 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/05f0097c-957a-4679-bfde-f6fd3bb64a31-openshift-service-ca\") pod \"perses-operator-5bf474d74f-4gjdk\" (UID: \"05f0097c-957a-4679-bfde-f6fd3bb64a31\") " pod="openshift-operators/perses-operator-5bf474d74f-4gjdk" Mar 10 00:19:10 crc kubenswrapper[4906]: I0310 00:19:10.845235 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q22kg\" (UniqueName: \"kubernetes.io/projected/05f0097c-957a-4679-bfde-f6fd3bb64a31-kube-api-access-q22kg\") pod \"perses-operator-5bf474d74f-4gjdk\" (UID: \"05f0097c-957a-4679-bfde-f6fd3bb64a31\") " pod="openshift-operators/perses-operator-5bf474d74f-4gjdk" Mar 10 00:19:10 crc kubenswrapper[4906]: I0310 00:19:10.845313 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/05f0097c-957a-4679-bfde-f6fd3bb64a31-openshift-service-ca\") pod \"perses-operator-5bf474d74f-4gjdk\" (UID: \"05f0097c-957a-4679-bfde-f6fd3bb64a31\") " pod="openshift-operators/perses-operator-5bf474d74f-4gjdk" Mar 10 00:19:10 crc kubenswrapper[4906]: I0310 00:19:10.846752 4906 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/05f0097c-957a-4679-bfde-f6fd3bb64a31-openshift-service-ca\") pod \"perses-operator-5bf474d74f-4gjdk\" (UID: \"05f0097c-957a-4679-bfde-f6fd3bb64a31\") " pod="openshift-operators/perses-operator-5bf474d74f-4gjdk" Mar 10 00:19:10 crc kubenswrapper[4906]: I0310 00:19:10.870183 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-qm858"] Mar 10 00:19:10 crc kubenswrapper[4906]: I0310 00:19:10.877213 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q22kg\" (UniqueName: \"kubernetes.io/projected/05f0097c-957a-4679-bfde-f6fd3bb64a31-kube-api-access-q22kg\") pod \"perses-operator-5bf474d74f-4gjdk\" (UID: \"05f0097c-957a-4679-bfde-f6fd3bb64a31\") " pod="openshift-operators/perses-operator-5bf474d74f-4gjdk" Mar 10 00:19:10 crc kubenswrapper[4906]: I0310 00:19:10.915990 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-4gjdk" Mar 10 00:19:11 crc kubenswrapper[4906]: I0310 00:19:11.046684 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6878d69d47-mlwmd"] Mar 10 00:19:11 crc kubenswrapper[4906]: I0310 00:19:11.059889 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-6w9dp"] Mar 10 00:19:11 crc kubenswrapper[4906]: I0310 00:19:11.152843 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6878d69d47-mbmcr"] Mar 10 00:19:11 crc kubenswrapper[4906]: I0310 00:19:11.357729 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-4gjdk"] Mar 10 00:19:11 crc kubenswrapper[4906]: W0310 00:19:11.372181 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05f0097c_957a_4679_bfde_f6fd3bb64a31.slice/crio-09650642f020d56cc518760608b0f8db95f4029fafa888e8b20528875315f240 WatchSource:0}: Error finding container 09650642f020d56cc518760608b0f8db95f4029fafa888e8b20528875315f240: Status 404 returned error can't find the container with id 09650642f020d56cc518760608b0f8db95f4029fafa888e8b20528875315f240 Mar 10 00:19:11 crc kubenswrapper[4906]: I0310 00:19:11.838326 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-4gjdk" event={"ID":"05f0097c-957a-4679-bfde-f6fd3bb64a31","Type":"ContainerStarted","Data":"09650642f020d56cc518760608b0f8db95f4029fafa888e8b20528875315f240"} Mar 10 00:19:11 crc kubenswrapper[4906]: I0310 00:19:11.840933 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6878d69d47-mbmcr" 
event={"ID":"6604103a-2326-470c-b01f-a99fea58b571","Type":"ContainerStarted","Data":"c7a84c27d101b65ec6f1fcd6f5b4c59774ce385d0c3e25fca91fb86e18eb4a22"} Mar 10 00:19:11 crc kubenswrapper[4906]: I0310 00:19:11.856670 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-6w9dp" event={"ID":"ba8e917b-ba75-43ca-ba9e-d52c72031402","Type":"ContainerStarted","Data":"c0d64f18bea6f7de4c711fdc077f0cfe48f0194b0b94a2150cf4973e32f8b4c5"} Mar 10 00:19:11 crc kubenswrapper[4906]: I0310 00:19:11.862009 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-qm858" event={"ID":"a13dbb4b-4d05-4b5a-9155-863ae39b84d7","Type":"ContainerStarted","Data":"7fe759f4e90b13e3b7338cf468813ff1495fb653f6b523286154a69c39ea7508"} Mar 10 00:19:11 crc kubenswrapper[4906]: I0310 00:19:11.863254 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6878d69d47-mlwmd" event={"ID":"6f2dab40-1946-479a-9741-637188d9fe10","Type":"ContainerStarted","Data":"4e83920efd160d48f12cb5020b6d08b92591a1bdd49fe0bb49c436dd8d33b264"} Mar 10 00:19:15 crc kubenswrapper[4906]: I0310 00:19:15.319443 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-rqjgt"] Mar 10 00:19:15 crc kubenswrapper[4906]: I0310 00:19:15.320721 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/interconnect-operator-5bb49f789d-rqjgt" Mar 10 00:19:15 crc kubenswrapper[4906]: I0310 00:19:15.323449 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"interconnect-operator-dockercfg-29jnd" Mar 10 00:19:15 crc kubenswrapper[4906]: I0310 00:19:15.324083 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"openshift-service-ca.crt" Mar 10 00:19:15 crc kubenswrapper[4906]: I0310 00:19:15.325151 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"kube-root-ca.crt" Mar 10 00:19:15 crc kubenswrapper[4906]: I0310 00:19:15.330175 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-rqjgt"] Mar 10 00:19:15 crc kubenswrapper[4906]: I0310 00:19:15.428426 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v79lv\" (UniqueName: \"kubernetes.io/projected/8a817d09-73d0-4254-b799-e7f6bdc7d1bc-kube-api-access-v79lv\") pod \"interconnect-operator-5bb49f789d-rqjgt\" (UID: \"8a817d09-73d0-4254-b799-e7f6bdc7d1bc\") " pod="service-telemetry/interconnect-operator-5bb49f789d-rqjgt" Mar 10 00:19:15 crc kubenswrapper[4906]: I0310 00:19:15.532330 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v79lv\" (UniqueName: \"kubernetes.io/projected/8a817d09-73d0-4254-b799-e7f6bdc7d1bc-kube-api-access-v79lv\") pod \"interconnect-operator-5bb49f789d-rqjgt\" (UID: \"8a817d09-73d0-4254-b799-e7f6bdc7d1bc\") " pod="service-telemetry/interconnect-operator-5bb49f789d-rqjgt" Mar 10 00:19:15 crc kubenswrapper[4906]: I0310 00:19:15.566861 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v79lv\" (UniqueName: \"kubernetes.io/projected/8a817d09-73d0-4254-b799-e7f6bdc7d1bc-kube-api-access-v79lv\") pod 
\"interconnect-operator-5bb49f789d-rqjgt\" (UID: \"8a817d09-73d0-4254-b799-e7f6bdc7d1bc\") " pod="service-telemetry/interconnect-operator-5bb49f789d-rqjgt" Mar 10 00:19:15 crc kubenswrapper[4906]: I0310 00:19:15.661061 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/interconnect-operator-5bb49f789d-rqjgt" Mar 10 00:19:18 crc kubenswrapper[4906]: I0310 00:19:18.908904 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/elastic-operator-5557947dc7-2qnwb"] Mar 10 00:19:18 crc kubenswrapper[4906]: I0310 00:19:18.910035 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elastic-operator-5557947dc7-2qnwb" Mar 10 00:19:18 crc kubenswrapper[4906]: I0310 00:19:18.912718 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elastic-operator-dockercfg-98cqr" Mar 10 00:19:18 crc kubenswrapper[4906]: I0310 00:19:18.912908 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elastic-operator-service-cert" Mar 10 00:19:18 crc kubenswrapper[4906]: I0310 00:19:18.933304 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elastic-operator-5557947dc7-2qnwb"] Mar 10 00:19:19 crc kubenswrapper[4906]: I0310 00:19:19.003685 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crgjf\" (UniqueName: \"kubernetes.io/projected/fe3dde6b-0c57-436b-a07d-9d976199d739-kube-api-access-crgjf\") pod \"elastic-operator-5557947dc7-2qnwb\" (UID: \"fe3dde6b-0c57-436b-a07d-9d976199d739\") " pod="service-telemetry/elastic-operator-5557947dc7-2qnwb" Mar 10 00:19:19 crc kubenswrapper[4906]: I0310 00:19:19.003741 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fe3dde6b-0c57-436b-a07d-9d976199d739-apiservice-cert\") pod 
\"elastic-operator-5557947dc7-2qnwb\" (UID: \"fe3dde6b-0c57-436b-a07d-9d976199d739\") " pod="service-telemetry/elastic-operator-5557947dc7-2qnwb" Mar 10 00:19:19 crc kubenswrapper[4906]: I0310 00:19:19.003761 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fe3dde6b-0c57-436b-a07d-9d976199d739-webhook-cert\") pod \"elastic-operator-5557947dc7-2qnwb\" (UID: \"fe3dde6b-0c57-436b-a07d-9d976199d739\") " pod="service-telemetry/elastic-operator-5557947dc7-2qnwb" Mar 10 00:19:19 crc kubenswrapper[4906]: I0310 00:19:19.105789 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crgjf\" (UniqueName: \"kubernetes.io/projected/fe3dde6b-0c57-436b-a07d-9d976199d739-kube-api-access-crgjf\") pod \"elastic-operator-5557947dc7-2qnwb\" (UID: \"fe3dde6b-0c57-436b-a07d-9d976199d739\") " pod="service-telemetry/elastic-operator-5557947dc7-2qnwb" Mar 10 00:19:19 crc kubenswrapper[4906]: I0310 00:19:19.106239 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fe3dde6b-0c57-436b-a07d-9d976199d739-apiservice-cert\") pod \"elastic-operator-5557947dc7-2qnwb\" (UID: \"fe3dde6b-0c57-436b-a07d-9d976199d739\") " pod="service-telemetry/elastic-operator-5557947dc7-2qnwb" Mar 10 00:19:19 crc kubenswrapper[4906]: I0310 00:19:19.106263 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fe3dde6b-0c57-436b-a07d-9d976199d739-webhook-cert\") pod \"elastic-operator-5557947dc7-2qnwb\" (UID: \"fe3dde6b-0c57-436b-a07d-9d976199d739\") " pod="service-telemetry/elastic-operator-5557947dc7-2qnwb" Mar 10 00:19:19 crc kubenswrapper[4906]: I0310 00:19:19.116524 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/fe3dde6b-0c57-436b-a07d-9d976199d739-apiservice-cert\") pod \"elastic-operator-5557947dc7-2qnwb\" (UID: \"fe3dde6b-0c57-436b-a07d-9d976199d739\") " pod="service-telemetry/elastic-operator-5557947dc7-2qnwb" Mar 10 00:19:19 crc kubenswrapper[4906]: I0310 00:19:19.117574 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fe3dde6b-0c57-436b-a07d-9d976199d739-webhook-cert\") pod \"elastic-operator-5557947dc7-2qnwb\" (UID: \"fe3dde6b-0c57-436b-a07d-9d976199d739\") " pod="service-telemetry/elastic-operator-5557947dc7-2qnwb" Mar 10 00:19:19 crc kubenswrapper[4906]: I0310 00:19:19.136154 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crgjf\" (UniqueName: \"kubernetes.io/projected/fe3dde6b-0c57-436b-a07d-9d976199d739-kube-api-access-crgjf\") pod \"elastic-operator-5557947dc7-2qnwb\" (UID: \"fe3dde6b-0c57-436b-a07d-9d976199d739\") " pod="service-telemetry/elastic-operator-5557947dc7-2qnwb" Mar 10 00:19:19 crc kubenswrapper[4906]: I0310 00:19:19.240002 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/elastic-operator-5557947dc7-2qnwb" Mar 10 00:19:24 crc kubenswrapper[4906]: I0310 00:19:24.086141 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-rqjgt"] Mar 10 00:19:24 crc kubenswrapper[4906]: I0310 00:19:24.440162 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elastic-operator-5557947dc7-2qnwb"] Mar 10 00:19:24 crc kubenswrapper[4906]: W0310 00:19:24.456923 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe3dde6b_0c57_436b_a07d_9d976199d739.slice/crio-7e5b60ed19cda3846c1e6605676a7729c088f98ee2f9f35ba2b37551e7929844 WatchSource:0}: Error finding container 7e5b60ed19cda3846c1e6605676a7729c088f98ee2f9f35ba2b37551e7929844: Status 404 returned error can't find the container with id 7e5b60ed19cda3846c1e6605676a7729c088f98ee2f9f35ba2b37551e7929844 Mar 10 00:19:25 crc kubenswrapper[4906]: I0310 00:19:25.019950 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elastic-operator-5557947dc7-2qnwb" event={"ID":"fe3dde6b-0c57-436b-a07d-9d976199d739","Type":"ContainerStarted","Data":"7e5b60ed19cda3846c1e6605676a7729c088f98ee2f9f35ba2b37551e7929844"} Mar 10 00:19:25 crc kubenswrapper[4906]: I0310 00:19:25.021594 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/interconnect-operator-5bb49f789d-rqjgt" event={"ID":"8a817d09-73d0-4254-b799-e7f6bdc7d1bc","Type":"ContainerStarted","Data":"4101909002a46a2aec68bb9bfd52fe56ac33f1e0f0c240ec82079d8bd7e139a2"} Mar 10 00:19:25 crc kubenswrapper[4906]: I0310 00:19:25.039307 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6878d69d47-mlwmd" event={"ID":"6f2dab40-1946-479a-9741-637188d9fe10","Type":"ContainerStarted","Data":"0143af531a3ffd422a1506182d5d906602a59f4eceac516f3050687cf8ead2e1"} 
Mar 10 00:19:25 crc kubenswrapper[4906]: I0310 00:19:25.041158 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-4gjdk" event={"ID":"05f0097c-957a-4679-bfde-f6fd3bb64a31","Type":"ContainerStarted","Data":"0ec4b2876f8387cefbde9ec7c1f2ca2d56776f263dbd8a2785cbd60f88b064d8"} Mar 10 00:19:25 crc kubenswrapper[4906]: I0310 00:19:25.042245 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-4gjdk" Mar 10 00:19:25 crc kubenswrapper[4906]: I0310 00:19:25.047722 4906 generic.go:334] "Generic (PLEG): container finished" podID="79926529-086e-4612-ac41-ad0e16ac2a4d" containerID="d7bb48c9a24947b17fb309f83f41ef6ee8e7b2391327c10562e7da55689665f0" exitCode=0 Mar 10 00:19:25 crc kubenswrapper[4906]: I0310 00:19:25.047779 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gchrq" event={"ID":"79926529-086e-4612-ac41-ad0e16ac2a4d","Type":"ContainerDied","Data":"d7bb48c9a24947b17fb309f83f41ef6ee8e7b2391327c10562e7da55689665f0"} Mar 10 00:19:25 crc kubenswrapper[4906]: I0310 00:19:25.049828 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6878d69d47-mbmcr" event={"ID":"6604103a-2326-470c-b01f-a99fea58b571","Type":"ContainerStarted","Data":"39421e9ab11e99d52ce28fe82ef720980d47af5c0f5b4edab7549fe837cc500f"} Mar 10 00:19:25 crc kubenswrapper[4906]: I0310 00:19:25.051438 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-6w9dp" event={"ID":"ba8e917b-ba75-43ca-ba9e-d52c72031402","Type":"ContainerStarted","Data":"c21de5d4a068f8ab2c9ecb2fef6622853a9a02cdf67cc7424e2e51c901e67195"} Mar 10 00:19:25 crc kubenswrapper[4906]: I0310 00:19:25.051828 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operators/observability-operator-59bdc8b94-6w9dp" Mar 10 00:19:25 crc kubenswrapper[4906]: I0310 00:19:25.054203 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-qm858" event={"ID":"a13dbb4b-4d05-4b5a-9155-863ae39b84d7","Type":"ContainerStarted","Data":"46c7f513933c013a66d1e0def6a0b449afe2ab057aacf737610bca99353441c8"} Mar 10 00:19:25 crc kubenswrapper[4906]: I0310 00:19:25.076633 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6878d69d47-mlwmd" podStartSLOduration=2.2704598049999998 podStartE2EDuration="15.076605799s" podCreationTimestamp="2026-03-10 00:19:10 +0000 UTC" firstStartedPulling="2026-03-10 00:19:11.086962033 +0000 UTC m=+777.234857145" lastFinishedPulling="2026-03-10 00:19:23.893108027 +0000 UTC m=+790.041003139" observedRunningTime="2026-03-10 00:19:25.06988485 +0000 UTC m=+791.217779962" watchObservedRunningTime="2026-03-10 00:19:25.076605799 +0000 UTC m=+791.224500911" Mar 10 00:19:25 crc kubenswrapper[4906]: I0310 00:19:25.082010 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-6w9dp" Mar 10 00:19:25 crc kubenswrapper[4906]: I0310 00:19:25.091816 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-6w9dp" podStartSLOduration=2.289542572 podStartE2EDuration="15.091794236s" podCreationTimestamp="2026-03-10 00:19:10 +0000 UTC" firstStartedPulling="2026-03-10 00:19:11.090883253 +0000 UTC m=+777.238778365" lastFinishedPulling="2026-03-10 00:19:23.893134917 +0000 UTC m=+790.041030029" observedRunningTime="2026-03-10 00:19:25.086451156 +0000 UTC m=+791.234346268" watchObservedRunningTime="2026-03-10 00:19:25.091794236 +0000 UTC m=+791.239689338" Mar 10 00:19:25 crc kubenswrapper[4906]: I0310 00:19:25.146497 4906 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6878d69d47-mbmcr" podStartSLOduration=2.447384904 podStartE2EDuration="15.146469005s" podCreationTimestamp="2026-03-10 00:19:10 +0000 UTC" firstStartedPulling="2026-03-10 00:19:11.194446828 +0000 UTC m=+777.342341940" lastFinishedPulling="2026-03-10 00:19:23.893530929 +0000 UTC m=+790.041426041" observedRunningTime="2026-03-10 00:19:25.140421205 +0000 UTC m=+791.288316337" watchObservedRunningTime="2026-03-10 00:19:25.146469005 +0000 UTC m=+791.294364117" Mar 10 00:19:25 crc kubenswrapper[4906]: I0310 00:19:25.177067 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-4gjdk" podStartSLOduration=2.707440463 podStartE2EDuration="15.177020795s" podCreationTimestamp="2026-03-10 00:19:10 +0000 UTC" firstStartedPulling="2026-03-10 00:19:11.374540066 +0000 UTC m=+777.522435178" lastFinishedPulling="2026-03-10 00:19:23.844120398 +0000 UTC m=+789.992015510" observedRunningTime="2026-03-10 00:19:25.165943324 +0000 UTC m=+791.313838436" watchObservedRunningTime="2026-03-10 00:19:25.177020795 +0000 UTC m=+791.324915927" Mar 10 00:19:25 crc kubenswrapper[4906]: I0310 00:19:25.200572 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-qm858" podStartSLOduration=2.206710651 podStartE2EDuration="15.200546198s" podCreationTimestamp="2026-03-10 00:19:10 +0000 UTC" firstStartedPulling="2026-03-10 00:19:10.894863666 +0000 UTC m=+777.042758778" lastFinishedPulling="2026-03-10 00:19:23.888699213 +0000 UTC m=+790.036594325" observedRunningTime="2026-03-10 00:19:25.198378056 +0000 UTC m=+791.346273168" watchObservedRunningTime="2026-03-10 00:19:25.200546198 +0000 UTC m=+791.348441310" Mar 10 00:19:25 crc kubenswrapper[4906]: I0310 00:19:25.266182 4906 dynamic_cafile_content.go:123] "Loaded a new CA Bundle 
and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 10 00:19:26 crc kubenswrapper[4906]: I0310 00:19:26.063859 4906 generic.go:334] "Generic (PLEG): container finished" podID="79926529-086e-4612-ac41-ad0e16ac2a4d" containerID="0ba80f24b8b04a5899e95f368006abcd9b836725a41aac3d9fc06737d5db00a6" exitCode=0 Mar 10 00:19:26 crc kubenswrapper[4906]: I0310 00:19:26.064162 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gchrq" event={"ID":"79926529-086e-4612-ac41-ad0e16ac2a4d","Type":"ContainerDied","Data":"0ba80f24b8b04a5899e95f368006abcd9b836725a41aac3d9fc06737d5db00a6"} Mar 10 00:19:27 crc kubenswrapper[4906]: I0310 00:19:27.428361 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gchrq" Mar 10 00:19:27 crc kubenswrapper[4906]: I0310 00:19:27.558219 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/79926529-086e-4612-ac41-ad0e16ac2a4d-bundle\") pod \"79926529-086e-4612-ac41-ad0e16ac2a4d\" (UID: \"79926529-086e-4612-ac41-ad0e16ac2a4d\") " Mar 10 00:19:27 crc kubenswrapper[4906]: I0310 00:19:27.558347 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xg9zb\" (UniqueName: \"kubernetes.io/projected/79926529-086e-4612-ac41-ad0e16ac2a4d-kube-api-access-xg9zb\") pod \"79926529-086e-4612-ac41-ad0e16ac2a4d\" (UID: \"79926529-086e-4612-ac41-ad0e16ac2a4d\") " Mar 10 00:19:27 crc kubenswrapper[4906]: I0310 00:19:27.558502 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/79926529-086e-4612-ac41-ad0e16ac2a4d-util\") pod \"79926529-086e-4612-ac41-ad0e16ac2a4d\" (UID: \"79926529-086e-4612-ac41-ad0e16ac2a4d\") " Mar 10 00:19:27 crc 
kubenswrapper[4906]: I0310 00:19:27.560365 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79926529-086e-4612-ac41-ad0e16ac2a4d-bundle" (OuterVolumeSpecName: "bundle") pod "79926529-086e-4612-ac41-ad0e16ac2a4d" (UID: "79926529-086e-4612-ac41-ad0e16ac2a4d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:19:27 crc kubenswrapper[4906]: I0310 00:19:27.580664 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79926529-086e-4612-ac41-ad0e16ac2a4d-kube-api-access-xg9zb" (OuterVolumeSpecName: "kube-api-access-xg9zb") pod "79926529-086e-4612-ac41-ad0e16ac2a4d" (UID: "79926529-086e-4612-ac41-ad0e16ac2a4d"). InnerVolumeSpecName "kube-api-access-xg9zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:19:27 crc kubenswrapper[4906]: I0310 00:19:27.581118 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79926529-086e-4612-ac41-ad0e16ac2a4d-util" (OuterVolumeSpecName: "util") pod "79926529-086e-4612-ac41-ad0e16ac2a4d" (UID: "79926529-086e-4612-ac41-ad0e16ac2a4d"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:19:27 crc kubenswrapper[4906]: I0310 00:19:27.660175 4906 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/79926529-086e-4612-ac41-ad0e16ac2a4d-util\") on node \"crc\" DevicePath \"\"" Mar 10 00:19:27 crc kubenswrapper[4906]: I0310 00:19:27.660212 4906 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/79926529-086e-4612-ac41-ad0e16ac2a4d-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 00:19:27 crc kubenswrapper[4906]: I0310 00:19:27.660222 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xg9zb\" (UniqueName: \"kubernetes.io/projected/79926529-086e-4612-ac41-ad0e16ac2a4d-kube-api-access-xg9zb\") on node \"crc\" DevicePath \"\"" Mar 10 00:19:28 crc kubenswrapper[4906]: I0310 00:19:28.077726 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gchrq" event={"ID":"79926529-086e-4612-ac41-ad0e16ac2a4d","Type":"ContainerDied","Data":"86a8aef5d0126afaa08ce5910ad97b493deff24e79508f0db84479e90787dae9"} Mar 10 00:19:28 crc kubenswrapper[4906]: I0310 00:19:28.078157 4906 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86a8aef5d0126afaa08ce5910ad97b493deff24e79508f0db84479e90787dae9" Mar 10 00:19:28 crc kubenswrapper[4906]: I0310 00:19:28.078231 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gchrq" Mar 10 00:19:28 crc kubenswrapper[4906]: I0310 00:19:28.860335 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-w8xv6"] Mar 10 00:19:28 crc kubenswrapper[4906]: E0310 00:19:28.860586 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79926529-086e-4612-ac41-ad0e16ac2a4d" containerName="extract" Mar 10 00:19:28 crc kubenswrapper[4906]: I0310 00:19:28.860597 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="79926529-086e-4612-ac41-ad0e16ac2a4d" containerName="extract" Mar 10 00:19:28 crc kubenswrapper[4906]: E0310 00:19:28.860606 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79926529-086e-4612-ac41-ad0e16ac2a4d" containerName="pull" Mar 10 00:19:28 crc kubenswrapper[4906]: I0310 00:19:28.860611 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="79926529-086e-4612-ac41-ad0e16ac2a4d" containerName="pull" Mar 10 00:19:28 crc kubenswrapper[4906]: E0310 00:19:28.860628 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79926529-086e-4612-ac41-ad0e16ac2a4d" containerName="util" Mar 10 00:19:28 crc kubenswrapper[4906]: I0310 00:19:28.860653 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="79926529-086e-4612-ac41-ad0e16ac2a4d" containerName="util" Mar 10 00:19:28 crc kubenswrapper[4906]: I0310 00:19:28.860744 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="79926529-086e-4612-ac41-ad0e16ac2a4d" containerName="extract" Mar 10 00:19:28 crc kubenswrapper[4906]: I0310 00:19:28.861520 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w8xv6" Mar 10 00:19:28 crc kubenswrapper[4906]: I0310 00:19:28.869603 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w8xv6"] Mar 10 00:19:28 crc kubenswrapper[4906]: I0310 00:19:28.980042 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/122e3887-7832-4429-97b1-7d3144d23b69-catalog-content\") pod \"redhat-operators-w8xv6\" (UID: \"122e3887-7832-4429-97b1-7d3144d23b69\") " pod="openshift-marketplace/redhat-operators-w8xv6" Mar 10 00:19:28 crc kubenswrapper[4906]: I0310 00:19:28.980103 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krsgf\" (UniqueName: \"kubernetes.io/projected/122e3887-7832-4429-97b1-7d3144d23b69-kube-api-access-krsgf\") pod \"redhat-operators-w8xv6\" (UID: \"122e3887-7832-4429-97b1-7d3144d23b69\") " pod="openshift-marketplace/redhat-operators-w8xv6" Mar 10 00:19:28 crc kubenswrapper[4906]: I0310 00:19:28.980544 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/122e3887-7832-4429-97b1-7d3144d23b69-utilities\") pod \"redhat-operators-w8xv6\" (UID: \"122e3887-7832-4429-97b1-7d3144d23b69\") " pod="openshift-marketplace/redhat-operators-w8xv6" Mar 10 00:19:29 crc kubenswrapper[4906]: I0310 00:19:29.081602 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/122e3887-7832-4429-97b1-7d3144d23b69-catalog-content\") pod \"redhat-operators-w8xv6\" (UID: \"122e3887-7832-4429-97b1-7d3144d23b69\") " pod="openshift-marketplace/redhat-operators-w8xv6" Mar 10 00:19:29 crc kubenswrapper[4906]: I0310 00:19:29.081687 4906 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-krsgf\" (UniqueName: \"kubernetes.io/projected/122e3887-7832-4429-97b1-7d3144d23b69-kube-api-access-krsgf\") pod \"redhat-operators-w8xv6\" (UID: \"122e3887-7832-4429-97b1-7d3144d23b69\") " pod="openshift-marketplace/redhat-operators-w8xv6" Mar 10 00:19:29 crc kubenswrapper[4906]: I0310 00:19:29.081719 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/122e3887-7832-4429-97b1-7d3144d23b69-utilities\") pod \"redhat-operators-w8xv6\" (UID: \"122e3887-7832-4429-97b1-7d3144d23b69\") " pod="openshift-marketplace/redhat-operators-w8xv6" Mar 10 00:19:29 crc kubenswrapper[4906]: I0310 00:19:29.082272 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/122e3887-7832-4429-97b1-7d3144d23b69-catalog-content\") pod \"redhat-operators-w8xv6\" (UID: \"122e3887-7832-4429-97b1-7d3144d23b69\") " pod="openshift-marketplace/redhat-operators-w8xv6" Mar 10 00:19:29 crc kubenswrapper[4906]: I0310 00:19:29.082541 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/122e3887-7832-4429-97b1-7d3144d23b69-utilities\") pod \"redhat-operators-w8xv6\" (UID: \"122e3887-7832-4429-97b1-7d3144d23b69\") " pod="openshift-marketplace/redhat-operators-w8xv6" Mar 10 00:19:29 crc kubenswrapper[4906]: I0310 00:19:29.117674 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krsgf\" (UniqueName: \"kubernetes.io/projected/122e3887-7832-4429-97b1-7d3144d23b69-kube-api-access-krsgf\") pod \"redhat-operators-w8xv6\" (UID: \"122e3887-7832-4429-97b1-7d3144d23b69\") " pod="openshift-marketplace/redhat-operators-w8xv6" Mar 10 00:19:29 crc kubenswrapper[4906]: I0310 00:19:29.180106 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w8xv6" Mar 10 00:19:30 crc kubenswrapper[4906]: I0310 00:19:30.057720 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w8xv6"] Mar 10 00:19:30 crc kubenswrapper[4906]: I0310 00:19:30.092350 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elastic-operator-5557947dc7-2qnwb" event={"ID":"fe3dde6b-0c57-436b-a07d-9d976199d739","Type":"ContainerStarted","Data":"92338a86913d4dbc84eae46d0d568dda2c6f3857e9437163cfe1afbe194e2ba3"} Mar 10 00:19:30 crc kubenswrapper[4906]: I0310 00:19:30.095289 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w8xv6" event={"ID":"122e3887-7832-4429-97b1-7d3144d23b69","Type":"ContainerStarted","Data":"6c13a0529952b80e983dd0d2f41b2eabdb5ded63a1d1e6fc0479c96cefd4376c"} Mar 10 00:19:30 crc kubenswrapper[4906]: I0310 00:19:30.502163 4906 patch_prober.go:28] interesting pod/machine-config-daemon-bxtw4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:19:30 crc kubenswrapper[4906]: I0310 00:19:30.502235 4906 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:19:30 crc kubenswrapper[4906]: I0310 00:19:30.920282 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-4gjdk" Mar 10 00:19:30 crc kubenswrapper[4906]: I0310 00:19:30.959328 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="service-telemetry/elastic-operator-5557947dc7-2qnwb" podStartSLOduration=7.518375293 podStartE2EDuration="12.959312139s" podCreationTimestamp="2026-03-10 00:19:18 +0000 UTC" firstStartedPulling="2026-03-10 00:19:24.463946894 +0000 UTC m=+790.611842006" lastFinishedPulling="2026-03-10 00:19:29.90488375 +0000 UTC m=+796.052778852" observedRunningTime="2026-03-10 00:19:30.114122629 +0000 UTC m=+796.262017751" watchObservedRunningTime="2026-03-10 00:19:30.959312139 +0000 UTC m=+797.107207251" Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.109444 4906 generic.go:334] "Generic (PLEG): container finished" podID="122e3887-7832-4429-97b1-7d3144d23b69" containerID="57f3991d5241be80553bce428aac2faca5153095b8538afbe830c9adfc58ec69" exitCode=0 Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.110308 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w8xv6" event={"ID":"122e3887-7832-4429-97b1-7d3144d23b69","Type":"ContainerDied","Data":"57f3991d5241be80553bce428aac2faca5153095b8538afbe830c9adfc58ec69"} Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.302948 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.304492 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.307160 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-internal-users" Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.307211 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-default-es-config" Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.307335 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-default-es-transport-certs" Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.308078 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-dockercfg-m9txb" Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.309004 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-remote-ca" Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.309036 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-http-certs-internal" Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.309560 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"elasticsearch-es-unicast-hosts" Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.309574 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-xpack-file-realm" Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.310009 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"elasticsearch-es-scripts" Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.331023 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.412682 4906 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/90a4eb11-c2c3-4d86-8cac-ddccdbab507a-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"90a4eb11-c2c3-4d86-8cac-ddccdbab507a\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.412785 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/90a4eb11-c2c3-4d86-8cac-ddccdbab507a-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"90a4eb11-c2c3-4d86-8cac-ddccdbab507a\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.412831 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/90a4eb11-c2c3-4d86-8cac-ddccdbab507a-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"90a4eb11-c2c3-4d86-8cac-ddccdbab507a\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.412864 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/90a4eb11-c2c3-4d86-8cac-ddccdbab507a-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"90a4eb11-c2c3-4d86-8cac-ddccdbab507a\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.413421 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: 
\"kubernetes.io/empty-dir/90a4eb11-c2c3-4d86-8cac-ddccdbab507a-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"90a4eb11-c2c3-4d86-8cac-ddccdbab507a\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.413487 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/90a4eb11-c2c3-4d86-8cac-ddccdbab507a-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"90a4eb11-c2c3-4d86-8cac-ddccdbab507a\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.413520 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/90a4eb11-c2c3-4d86-8cac-ddccdbab507a-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"90a4eb11-c2c3-4d86-8cac-ddccdbab507a\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.413566 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/90a4eb11-c2c3-4d86-8cac-ddccdbab507a-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"90a4eb11-c2c3-4d86-8cac-ddccdbab507a\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.416128 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/90a4eb11-c2c3-4d86-8cac-ddccdbab507a-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"90a4eb11-c2c3-4d86-8cac-ddccdbab507a\") " 
pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.416198 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/90a4eb11-c2c3-4d86-8cac-ddccdbab507a-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"90a4eb11-c2c3-4d86-8cac-ddccdbab507a\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.416223 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/90a4eb11-c2c3-4d86-8cac-ddccdbab507a-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"90a4eb11-c2c3-4d86-8cac-ddccdbab507a\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.416260 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/90a4eb11-c2c3-4d86-8cac-ddccdbab507a-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"90a4eb11-c2c3-4d86-8cac-ddccdbab507a\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.416284 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/90a4eb11-c2c3-4d86-8cac-ddccdbab507a-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"90a4eb11-c2c3-4d86-8cac-ddccdbab507a\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.416301 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/90a4eb11-c2c3-4d86-8cac-ddccdbab507a-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"90a4eb11-c2c3-4d86-8cac-ddccdbab507a\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.416455 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/90a4eb11-c2c3-4d86-8cac-ddccdbab507a-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"90a4eb11-c2c3-4d86-8cac-ddccdbab507a\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.517180 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/90a4eb11-c2c3-4d86-8cac-ddccdbab507a-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"90a4eb11-c2c3-4d86-8cac-ddccdbab507a\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.517245 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/90a4eb11-c2c3-4d86-8cac-ddccdbab507a-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"90a4eb11-c2c3-4d86-8cac-ddccdbab507a\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.517276 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/90a4eb11-c2c3-4d86-8cac-ddccdbab507a-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"90a4eb11-c2c3-4d86-8cac-ddccdbab507a\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 
00:19:31.517307 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/90a4eb11-c2c3-4d86-8cac-ddccdbab507a-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"90a4eb11-c2c3-4d86-8cac-ddccdbab507a\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.517338 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/90a4eb11-c2c3-4d86-8cac-ddccdbab507a-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"90a4eb11-c2c3-4d86-8cac-ddccdbab507a\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.517371 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/90a4eb11-c2c3-4d86-8cac-ddccdbab507a-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"90a4eb11-c2c3-4d86-8cac-ddccdbab507a\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.517397 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/90a4eb11-c2c3-4d86-8cac-ddccdbab507a-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"90a4eb11-c2c3-4d86-8cac-ddccdbab507a\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.517415 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/90a4eb11-c2c3-4d86-8cac-ddccdbab507a-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"90a4eb11-c2c3-4d86-8cac-ddccdbab507a\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 
10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.517449 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/90a4eb11-c2c3-4d86-8cac-ddccdbab507a-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"90a4eb11-c2c3-4d86-8cac-ddccdbab507a\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.517502 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/90a4eb11-c2c3-4d86-8cac-ddccdbab507a-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"90a4eb11-c2c3-4d86-8cac-ddccdbab507a\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.517523 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/90a4eb11-c2c3-4d86-8cac-ddccdbab507a-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"90a4eb11-c2c3-4d86-8cac-ddccdbab507a\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.517550 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/90a4eb11-c2c3-4d86-8cac-ddccdbab507a-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"90a4eb11-c2c3-4d86-8cac-ddccdbab507a\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.517579 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: 
\"kubernetes.io/secret/90a4eb11-c2c3-4d86-8cac-ddccdbab507a-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"90a4eb11-c2c3-4d86-8cac-ddccdbab507a\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.517611 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/90a4eb11-c2c3-4d86-8cac-ddccdbab507a-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"90a4eb11-c2c3-4d86-8cac-ddccdbab507a\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.517655 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/90a4eb11-c2c3-4d86-8cac-ddccdbab507a-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"90a4eb11-c2c3-4d86-8cac-ddccdbab507a\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.517886 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/90a4eb11-c2c3-4d86-8cac-ddccdbab507a-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"90a4eb11-c2c3-4d86-8cac-ddccdbab507a\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.518678 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/90a4eb11-c2c3-4d86-8cac-ddccdbab507a-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"90a4eb11-c2c3-4d86-8cac-ddccdbab507a\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.518775 4906 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/90a4eb11-c2c3-4d86-8cac-ddccdbab507a-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"90a4eb11-c2c3-4d86-8cac-ddccdbab507a\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.518891 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/90a4eb11-c2c3-4d86-8cac-ddccdbab507a-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"90a4eb11-c2c3-4d86-8cac-ddccdbab507a\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.519198 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/90a4eb11-c2c3-4d86-8cac-ddccdbab507a-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"90a4eb11-c2c3-4d86-8cac-ddccdbab507a\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.519884 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/90a4eb11-c2c3-4d86-8cac-ddccdbab507a-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"90a4eb11-c2c3-4d86-8cac-ddccdbab507a\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.520878 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/90a4eb11-c2c3-4d86-8cac-ddccdbab507a-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"90a4eb11-c2c3-4d86-8cac-ddccdbab507a\") " 
pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.521310 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/90a4eb11-c2c3-4d86-8cac-ddccdbab507a-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"90a4eb11-c2c3-4d86-8cac-ddccdbab507a\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.524976 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/90a4eb11-c2c3-4d86-8cac-ddccdbab507a-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"90a4eb11-c2c3-4d86-8cac-ddccdbab507a\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.525036 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/90a4eb11-c2c3-4d86-8cac-ddccdbab507a-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"90a4eb11-c2c3-4d86-8cac-ddccdbab507a\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.525556 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/90a4eb11-c2c3-4d86-8cac-ddccdbab507a-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"90a4eb11-c2c3-4d86-8cac-ddccdbab507a\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.526042 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-transport-certificates\" (UniqueName: 
\"kubernetes.io/secret/90a4eb11-c2c3-4d86-8cac-ddccdbab507a-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"90a4eb11-c2c3-4d86-8cac-ddccdbab507a\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.526361 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/90a4eb11-c2c3-4d86-8cac-ddccdbab507a-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"90a4eb11-c2c3-4d86-8cac-ddccdbab507a\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.529581 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/90a4eb11-c2c3-4d86-8cac-ddccdbab507a-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"90a4eb11-c2c3-4d86-8cac-ddccdbab507a\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.536547 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/90a4eb11-c2c3-4d86-8cac-ddccdbab507a-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"90a4eb11-c2c3-4d86-8cac-ddccdbab507a\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:31 crc kubenswrapper[4906]: I0310 00:19:31.619492 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:19:37 crc kubenswrapper[4906]: I0310 00:19:37.520856 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 10 00:19:37 crc kubenswrapper[4906]: W0310 00:19:37.538426 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90a4eb11_c2c3_4d86_8cac_ddccdbab507a.slice/crio-a2587d31405bc221bba03950913cc99194952abc8ec99ab94f1b4e2e2b00f309 WatchSource:0}: Error finding container a2587d31405bc221bba03950913cc99194952abc8ec99ab94f1b4e2e2b00f309: Status 404 returned error can't find the container with id a2587d31405bc221bba03950913cc99194952abc8ec99ab94f1b4e2e2b00f309 Mar 10 00:19:38 crc kubenswrapper[4906]: I0310 00:19:38.187718 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w8xv6" event={"ID":"122e3887-7832-4429-97b1-7d3144d23b69","Type":"ContainerStarted","Data":"a9847a09ed8182ec7b147a2a39a2e5db796a96bc4992f985dc687835224638c2"} Mar 10 00:19:38 crc kubenswrapper[4906]: I0310 00:19:38.191069 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/interconnect-operator-5bb49f789d-rqjgt" event={"ID":"8a817d09-73d0-4254-b799-e7f6bdc7d1bc","Type":"ContainerStarted","Data":"7c41e221e84d6799d6bf358b927fba69d63b6f19788bf370422d253163c798b0"} Mar 10 00:19:38 crc kubenswrapper[4906]: I0310 00:19:38.192426 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"90a4eb11-c2c3-4d86-8cac-ddccdbab507a","Type":"ContainerStarted","Data":"a2587d31405bc221bba03950913cc99194952abc8ec99ab94f1b4e2e2b00f309"} Mar 10 00:19:38 crc kubenswrapper[4906]: I0310 00:19:38.235535 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/interconnect-operator-5bb49f789d-rqjgt" podStartSLOduration=10.259184738 
podStartE2EDuration="23.235516572s" podCreationTimestamp="2026-03-10 00:19:15 +0000 UTC" firstStartedPulling="2026-03-10 00:19:24.100230417 +0000 UTC m=+790.248125529" lastFinishedPulling="2026-03-10 00:19:37.076562251 +0000 UTC m=+803.224457363" observedRunningTime="2026-03-10 00:19:38.230678376 +0000 UTC m=+804.378573488" watchObservedRunningTime="2026-03-10 00:19:38.235516572 +0000 UTC m=+804.383411684" Mar 10 00:19:39 crc kubenswrapper[4906]: I0310 00:19:39.200815 4906 generic.go:334] "Generic (PLEG): container finished" podID="122e3887-7832-4429-97b1-7d3144d23b69" containerID="a9847a09ed8182ec7b147a2a39a2e5db796a96bc4992f985dc687835224638c2" exitCode=0 Mar 10 00:19:39 crc kubenswrapper[4906]: I0310 00:19:39.200887 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w8xv6" event={"ID":"122e3887-7832-4429-97b1-7d3144d23b69","Type":"ContainerDied","Data":"a9847a09ed8182ec7b147a2a39a2e5db796a96bc4992f985dc687835224638c2"} Mar 10 00:19:40 crc kubenswrapper[4906]: I0310 00:19:40.209310 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w8xv6" event={"ID":"122e3887-7832-4429-97b1-7d3144d23b69","Type":"ContainerStarted","Data":"3a2a06e82881f0619814596c14e4be28be67c9b74577a1c2fa3e5305c553fdae"} Mar 10 00:19:40 crc kubenswrapper[4906]: I0310 00:19:40.285590 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-w8xv6" podStartSLOduration=8.728674218 podStartE2EDuration="12.285570314s" podCreationTimestamp="2026-03-10 00:19:28 +0000 UTC" firstStartedPulling="2026-03-10 00:19:36.316731644 +0000 UTC m=+802.464626756" lastFinishedPulling="2026-03-10 00:19:39.87362774 +0000 UTC m=+806.021522852" observedRunningTime="2026-03-10 00:19:40.273075503 +0000 UTC m=+806.420970615" watchObservedRunningTime="2026-03-10 00:19:40.285570314 +0000 UTC m=+806.433465426" Mar 10 00:19:46 crc kubenswrapper[4906]: I0310 00:19:46.539942 
4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-n898l"] Mar 10 00:19:46 crc kubenswrapper[4906]: I0310 00:19:46.541230 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-n898l" Mar 10 00:19:46 crc kubenswrapper[4906]: I0310 00:19:46.544063 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Mar 10 00:19:46 crc kubenswrapper[4906]: I0310 00:19:46.544337 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Mar 10 00:19:46 crc kubenswrapper[4906]: I0310 00:19:46.544484 4906 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-l7kpd" Mar 10 00:19:46 crc kubenswrapper[4906]: I0310 00:19:46.551889 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-n898l"] Mar 10 00:19:46 crc kubenswrapper[4906]: I0310 00:19:46.590774 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a57b0c82-3641-400f-8ae6-50cda6f33ecc-tmp\") pod \"cert-manager-operator-controller-manager-5586865c96-n898l\" (UID: \"a57b0c82-3641-400f-8ae6-50cda6f33ecc\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-n898l" Mar 10 00:19:46 crc kubenswrapper[4906]: I0310 00:19:46.591243 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrcf9\" (UniqueName: \"kubernetes.io/projected/a57b0c82-3641-400f-8ae6-50cda6f33ecc-kube-api-access-rrcf9\") pod \"cert-manager-operator-controller-manager-5586865c96-n898l\" (UID: 
\"a57b0c82-3641-400f-8ae6-50cda6f33ecc\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-n898l" Mar 10 00:19:46 crc kubenswrapper[4906]: I0310 00:19:46.693725 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a57b0c82-3641-400f-8ae6-50cda6f33ecc-tmp\") pod \"cert-manager-operator-controller-manager-5586865c96-n898l\" (UID: \"a57b0c82-3641-400f-8ae6-50cda6f33ecc\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-n898l" Mar 10 00:19:46 crc kubenswrapper[4906]: I0310 00:19:46.693808 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrcf9\" (UniqueName: \"kubernetes.io/projected/a57b0c82-3641-400f-8ae6-50cda6f33ecc-kube-api-access-rrcf9\") pod \"cert-manager-operator-controller-manager-5586865c96-n898l\" (UID: \"a57b0c82-3641-400f-8ae6-50cda6f33ecc\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-n898l" Mar 10 00:19:46 crc kubenswrapper[4906]: I0310 00:19:46.694330 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a57b0c82-3641-400f-8ae6-50cda6f33ecc-tmp\") pod \"cert-manager-operator-controller-manager-5586865c96-n898l\" (UID: \"a57b0c82-3641-400f-8ae6-50cda6f33ecc\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-n898l" Mar 10 00:19:46 crc kubenswrapper[4906]: I0310 00:19:46.715058 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrcf9\" (UniqueName: \"kubernetes.io/projected/a57b0c82-3641-400f-8ae6-50cda6f33ecc-kube-api-access-rrcf9\") pod \"cert-manager-operator-controller-manager-5586865c96-n898l\" (UID: \"a57b0c82-3641-400f-8ae6-50cda6f33ecc\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-n898l" Mar 10 00:19:46 crc kubenswrapper[4906]: I0310 00:19:46.883006 
4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-n898l" Mar 10 00:19:49 crc kubenswrapper[4906]: I0310 00:19:49.182057 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-w8xv6" Mar 10 00:19:49 crc kubenswrapper[4906]: I0310 00:19:49.182675 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-w8xv6" Mar 10 00:19:49 crc kubenswrapper[4906]: I0310 00:19:49.233769 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-w8xv6" Mar 10 00:19:49 crc kubenswrapper[4906]: I0310 00:19:49.308801 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-w8xv6" Mar 10 00:19:52 crc kubenswrapper[4906]: I0310 00:19:52.446410 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w8xv6"] Mar 10 00:19:52 crc kubenswrapper[4906]: I0310 00:19:52.447089 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-w8xv6" podUID="122e3887-7832-4429-97b1-7d3144d23b69" containerName="registry-server" containerID="cri-o://3a2a06e82881f0619814596c14e4be28be67c9b74577a1c2fa3e5305c553fdae" gracePeriod=2 Mar 10 00:19:53 crc kubenswrapper[4906]: I0310 00:19:53.307240 4906 generic.go:334] "Generic (PLEG): container finished" podID="122e3887-7832-4429-97b1-7d3144d23b69" containerID="3a2a06e82881f0619814596c14e4be28be67c9b74577a1c2fa3e5305c553fdae" exitCode=0 Mar 10 00:19:53 crc kubenswrapper[4906]: I0310 00:19:53.307299 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w8xv6" 
event={"ID":"122e3887-7832-4429-97b1-7d3144d23b69","Type":"ContainerDied","Data":"3a2a06e82881f0619814596c14e4be28be67c9b74577a1c2fa3e5305c553fdae"} Mar 10 00:19:54 crc kubenswrapper[4906]: I0310 00:19:54.691033 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-n898l"] Mar 10 00:19:54 crc kubenswrapper[4906]: W0310 00:19:54.717721 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda57b0c82_3641_400f_8ae6_50cda6f33ecc.slice/crio-99328e942c5f0b807a996671ff6d3585733ff6193d54740b9cc08c7876827e5a WatchSource:0}: Error finding container 99328e942c5f0b807a996671ff6d3585733ff6193d54740b9cc08c7876827e5a: Status 404 returned error can't find the container with id 99328e942c5f0b807a996671ff6d3585733ff6193d54740b9cc08c7876827e5a Mar 10 00:19:54 crc kubenswrapper[4906]: I0310 00:19:54.741605 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w8xv6" Mar 10 00:19:54 crc kubenswrapper[4906]: I0310 00:19:54.933608 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/122e3887-7832-4429-97b1-7d3144d23b69-utilities\") pod \"122e3887-7832-4429-97b1-7d3144d23b69\" (UID: \"122e3887-7832-4429-97b1-7d3144d23b69\") " Mar 10 00:19:54 crc kubenswrapper[4906]: I0310 00:19:54.934507 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krsgf\" (UniqueName: \"kubernetes.io/projected/122e3887-7832-4429-97b1-7d3144d23b69-kube-api-access-krsgf\") pod \"122e3887-7832-4429-97b1-7d3144d23b69\" (UID: \"122e3887-7832-4429-97b1-7d3144d23b69\") " Mar 10 00:19:54 crc kubenswrapper[4906]: I0310 00:19:54.934597 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/122e3887-7832-4429-97b1-7d3144d23b69-catalog-content\") pod \"122e3887-7832-4429-97b1-7d3144d23b69\" (UID: \"122e3887-7832-4429-97b1-7d3144d23b69\") " Mar 10 00:19:54 crc kubenswrapper[4906]: I0310 00:19:54.934802 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/122e3887-7832-4429-97b1-7d3144d23b69-utilities" (OuterVolumeSpecName: "utilities") pod "122e3887-7832-4429-97b1-7d3144d23b69" (UID: "122e3887-7832-4429-97b1-7d3144d23b69"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:19:54 crc kubenswrapper[4906]: I0310 00:19:54.935089 4906 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/122e3887-7832-4429-97b1-7d3144d23b69-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 00:19:54 crc kubenswrapper[4906]: I0310 00:19:54.942687 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/122e3887-7832-4429-97b1-7d3144d23b69-kube-api-access-krsgf" (OuterVolumeSpecName: "kube-api-access-krsgf") pod "122e3887-7832-4429-97b1-7d3144d23b69" (UID: "122e3887-7832-4429-97b1-7d3144d23b69"). InnerVolumeSpecName "kube-api-access-krsgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:19:55 crc kubenswrapper[4906]: I0310 00:19:55.037706 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krsgf\" (UniqueName: \"kubernetes.io/projected/122e3887-7832-4429-97b1-7d3144d23b69-kube-api-access-krsgf\") on node \"crc\" DevicePath \"\"" Mar 10 00:19:55 crc kubenswrapper[4906]: I0310 00:19:55.078460 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/122e3887-7832-4429-97b1-7d3144d23b69-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "122e3887-7832-4429-97b1-7d3144d23b69" (UID: "122e3887-7832-4429-97b1-7d3144d23b69"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:19:55 crc kubenswrapper[4906]: I0310 00:19:55.138410 4906 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/122e3887-7832-4429-97b1-7d3144d23b69-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 00:19:55 crc kubenswrapper[4906]: I0310 00:19:55.337245 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"90a4eb11-c2c3-4d86-8cac-ddccdbab507a","Type":"ContainerStarted","Data":"70b267fe2a6a4b3a96683eecd36fff5d491b9ac64a10976cf4227108b7841fee"} Mar 10 00:19:55 crc kubenswrapper[4906]: I0310 00:19:55.344279 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w8xv6" event={"ID":"122e3887-7832-4429-97b1-7d3144d23b69","Type":"ContainerDied","Data":"6c13a0529952b80e983dd0d2f41b2eabdb5ded63a1d1e6fc0479c96cefd4376c"} Mar 10 00:19:55 crc kubenswrapper[4906]: I0310 00:19:55.344327 4906 scope.go:117] "RemoveContainer" containerID="3a2a06e82881f0619814596c14e4be28be67c9b74577a1c2fa3e5305c553fdae" Mar 10 00:19:55 crc kubenswrapper[4906]: I0310 00:19:55.344428 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w8xv6" Mar 10 00:19:55 crc kubenswrapper[4906]: I0310 00:19:55.349040 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-n898l" event={"ID":"a57b0c82-3641-400f-8ae6-50cda6f33ecc","Type":"ContainerStarted","Data":"99328e942c5f0b807a996671ff6d3585733ff6193d54740b9cc08c7876827e5a"} Mar 10 00:19:55 crc kubenswrapper[4906]: I0310 00:19:55.376974 4906 scope.go:117] "RemoveContainer" containerID="a9847a09ed8182ec7b147a2a39a2e5db796a96bc4992f985dc687835224638c2" Mar 10 00:19:55 crc kubenswrapper[4906]: I0310 00:19:55.418268 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w8xv6"] Mar 10 00:19:55 crc kubenswrapper[4906]: I0310 00:19:55.425381 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-w8xv6"] Mar 10 00:19:55 crc kubenswrapper[4906]: I0310 00:19:55.425844 4906 scope.go:117] "RemoveContainer" containerID="57f3991d5241be80553bce428aac2faca5153095b8538afbe830c9adfc58ec69" Mar 10 00:19:55 crc kubenswrapper[4906]: I0310 00:19:55.513846 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 10 00:19:55 crc kubenswrapper[4906]: I0310 00:19:55.549824 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 10 00:19:56 crc kubenswrapper[4906]: I0310 00:19:56.588175 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="122e3887-7832-4429-97b1-7d3144d23b69" path="/var/lib/kubelet/pods/122e3887-7832-4429-97b1-7d3144d23b69/volumes" Mar 10 00:19:57 crc kubenswrapper[4906]: I0310 00:19:57.365733 4906 generic.go:334] "Generic (PLEG): container finished" podID="90a4eb11-c2c3-4d86-8cac-ddccdbab507a" containerID="70b267fe2a6a4b3a96683eecd36fff5d491b9ac64a10976cf4227108b7841fee" exitCode=0 Mar 10 00:19:57 crc 
kubenswrapper[4906]: I0310 00:19:57.365814 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"90a4eb11-c2c3-4d86-8cac-ddccdbab507a","Type":"ContainerDied","Data":"70b267fe2a6a4b3a96683eecd36fff5d491b9ac64a10976cf4227108b7841fee"} Mar 10 00:19:58 crc kubenswrapper[4906]: I0310 00:19:58.374704 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-n898l" event={"ID":"a57b0c82-3641-400f-8ae6-50cda6f33ecc","Type":"ContainerStarted","Data":"b806256fe833776d9c46f7dcc0370121c16ef8d804e19d7d2c4a346429dff395"} Mar 10 00:19:58 crc kubenswrapper[4906]: I0310 00:19:58.377498 4906 generic.go:334] "Generic (PLEG): container finished" podID="90a4eb11-c2c3-4d86-8cac-ddccdbab507a" containerID="56cce07cc156707130127dc20cac50058c0c47da516e0dd5abe8f80fcb9d7746" exitCode=0 Mar 10 00:19:58 crc kubenswrapper[4906]: I0310 00:19:58.377555 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"90a4eb11-c2c3-4d86-8cac-ddccdbab507a","Type":"ContainerDied","Data":"56cce07cc156707130127dc20cac50058c0c47da516e0dd5abe8f80fcb9d7746"} Mar 10 00:19:58 crc kubenswrapper[4906]: I0310 00:19:58.409790 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-n898l" podStartSLOduration=9.641075017 podStartE2EDuration="12.409763318s" podCreationTimestamp="2026-03-10 00:19:46 +0000 UTC" firstStartedPulling="2026-03-10 00:19:54.721464673 +0000 UTC m=+820.869359785" lastFinishedPulling="2026-03-10 00:19:57.490152974 +0000 UTC m=+823.638048086" observedRunningTime="2026-03-10 00:19:58.406481915 +0000 UTC m=+824.554377057" watchObservedRunningTime="2026-03-10 00:19:58.409763318 +0000 UTC m=+824.557658470" Mar 10 00:19:59 crc kubenswrapper[4906]: I0310 00:19:59.386579 4906 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"90a4eb11-c2c3-4d86-8cac-ddccdbab507a","Type":"ContainerStarted","Data":"4071ee8564f6cadeeb7c0e5fd0f3e78c6623f95690ef9bec801fe896bf0960e6"} Mar 10 00:19:59 crc kubenswrapper[4906]: I0310 00:19:59.422818 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/elasticsearch-es-default-0" podStartSLOduration=11.15306116 podStartE2EDuration="28.422800682s" podCreationTimestamp="2026-03-10 00:19:31 +0000 UTC" firstStartedPulling="2026-03-10 00:19:37.558211658 +0000 UTC m=+803.706106770" lastFinishedPulling="2026-03-10 00:19:54.82795118 +0000 UTC m=+820.975846292" observedRunningTime="2026-03-10 00:19:59.420304901 +0000 UTC m=+825.568200013" watchObservedRunningTime="2026-03-10 00:19:59.422800682 +0000 UTC m=+825.570695794" Mar 10 00:20:00 crc kubenswrapper[4906]: I0310 00:20:00.166405 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551700-8txcl"] Mar 10 00:20:00 crc kubenswrapper[4906]: E0310 00:20:00.166749 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="122e3887-7832-4429-97b1-7d3144d23b69" containerName="extract-content" Mar 10 00:20:00 crc kubenswrapper[4906]: I0310 00:20:00.166772 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="122e3887-7832-4429-97b1-7d3144d23b69" containerName="extract-content" Mar 10 00:20:00 crc kubenswrapper[4906]: E0310 00:20:00.166789 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="122e3887-7832-4429-97b1-7d3144d23b69" containerName="registry-server" Mar 10 00:20:00 crc kubenswrapper[4906]: I0310 00:20:00.166797 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="122e3887-7832-4429-97b1-7d3144d23b69" containerName="registry-server" Mar 10 00:20:00 crc kubenswrapper[4906]: E0310 00:20:00.166825 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="122e3887-7832-4429-97b1-7d3144d23b69" 
containerName="extract-utilities" Mar 10 00:20:00 crc kubenswrapper[4906]: I0310 00:20:00.166833 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="122e3887-7832-4429-97b1-7d3144d23b69" containerName="extract-utilities" Mar 10 00:20:00 crc kubenswrapper[4906]: I0310 00:20:00.166959 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="122e3887-7832-4429-97b1-7d3144d23b69" containerName="registry-server" Mar 10 00:20:00 crc kubenswrapper[4906]: I0310 00:20:00.167652 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551700-8txcl" Mar 10 00:20:00 crc kubenswrapper[4906]: I0310 00:20:00.170066 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 00:20:00 crc kubenswrapper[4906]: I0310 00:20:00.170117 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ktdr6" Mar 10 00:20:00 crc kubenswrapper[4906]: I0310 00:20:00.170381 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 00:20:00 crc kubenswrapper[4906]: I0310 00:20:00.177439 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551700-8txcl"] Mar 10 00:20:00 crc kubenswrapper[4906]: I0310 00:20:00.328894 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whthc\" (UniqueName: \"kubernetes.io/projected/f4f0a933-9b21-4d0b-8f8f-29e4c93f0336-kube-api-access-whthc\") pod \"auto-csr-approver-29551700-8txcl\" (UID: \"f4f0a933-9b21-4d0b-8f8f-29e4c93f0336\") " pod="openshift-infra/auto-csr-approver-29551700-8txcl" Mar 10 00:20:00 crc kubenswrapper[4906]: I0310 00:20:00.392925 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:20:00 crc kubenswrapper[4906]: I0310 
00:20:00.430985 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whthc\" (UniqueName: \"kubernetes.io/projected/f4f0a933-9b21-4d0b-8f8f-29e4c93f0336-kube-api-access-whthc\") pod \"auto-csr-approver-29551700-8txcl\" (UID: \"f4f0a933-9b21-4d0b-8f8f-29e4c93f0336\") " pod="openshift-infra/auto-csr-approver-29551700-8txcl" Mar 10 00:20:00 crc kubenswrapper[4906]: I0310 00:20:00.459821 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whthc\" (UniqueName: \"kubernetes.io/projected/f4f0a933-9b21-4d0b-8f8f-29e4c93f0336-kube-api-access-whthc\") pod \"auto-csr-approver-29551700-8txcl\" (UID: \"f4f0a933-9b21-4d0b-8f8f-29e4c93f0336\") " pod="openshift-infra/auto-csr-approver-29551700-8txcl" Mar 10 00:20:00 crc kubenswrapper[4906]: I0310 00:20:00.496133 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551700-8txcl" Mar 10 00:20:00 crc kubenswrapper[4906]: I0310 00:20:00.502222 4906 patch_prober.go:28] interesting pod/machine-config-daemon-bxtw4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:20:00 crc kubenswrapper[4906]: I0310 00:20:00.502270 4906 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:20:00 crc kubenswrapper[4906]: I0310 00:20:00.502316 4906 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" Mar 10 00:20:00 crc kubenswrapper[4906]: I0310 00:20:00.502906 4906 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ebdf53f2bff9bcf61abafe2c602bdab6ed5145f512fe38143bc7c112b9a35137"} pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 00:20:00 crc kubenswrapper[4906]: I0310 00:20:00.502960 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" containerName="machine-config-daemon" containerID="cri-o://ebdf53f2bff9bcf61abafe2c602bdab6ed5145f512fe38143bc7c112b9a35137" gracePeriod=600 Mar 10 00:20:00 crc kubenswrapper[4906]: I0310 00:20:00.788230 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551700-8txcl"] Mar 10 00:20:00 crc kubenswrapper[4906]: W0310 00:20:00.793351 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4f0a933_9b21_4d0b_8f8f_29e4c93f0336.slice/crio-39f17f807e3fd4c224512a34a696ee25e1a8afd12a0f140b0600536537084f0b WatchSource:0}: Error finding container 39f17f807e3fd4c224512a34a696ee25e1a8afd12a0f140b0600536537084f0b: Status 404 returned error can't find the container with id 39f17f807e3fd4c224512a34a696ee25e1a8afd12a0f140b0600536537084f0b Mar 10 00:20:01 crc kubenswrapper[4906]: I0310 00:20:01.399694 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551700-8txcl" event={"ID":"f4f0a933-9b21-4d0b-8f8f-29e4c93f0336","Type":"ContainerStarted","Data":"39f17f807e3fd4c224512a34a696ee25e1a8afd12a0f140b0600536537084f0b"} Mar 10 00:20:01 crc kubenswrapper[4906]: I0310 00:20:01.402898 4906 generic.go:334] "Generic (PLEG): container finished" podID="72d61d35-0a64-45a5-8df3-9c429727deba" 
containerID="ebdf53f2bff9bcf61abafe2c602bdab6ed5145f512fe38143bc7c112b9a35137" exitCode=0 Mar 10 00:20:01 crc kubenswrapper[4906]: I0310 00:20:01.403956 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" event={"ID":"72d61d35-0a64-45a5-8df3-9c429727deba","Type":"ContainerDied","Data":"ebdf53f2bff9bcf61abafe2c602bdab6ed5145f512fe38143bc7c112b9a35137"} Mar 10 00:20:01 crc kubenswrapper[4906]: I0310 00:20:01.403996 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" event={"ID":"72d61d35-0a64-45a5-8df3-9c429727deba","Type":"ContainerStarted","Data":"90546ce32412324c82386f9bf379f32c5b7b1738d24c8c80aab36b837d2ad814"} Mar 10 00:20:01 crc kubenswrapper[4906]: I0310 00:20:01.404017 4906 scope.go:117] "RemoveContainer" containerID="ff7670643966cdfab75a0c844ba14d4d3b7f4816f0572260c65b9eddbbe62eaa" Mar 10 00:20:01 crc kubenswrapper[4906]: I0310 00:20:01.682922 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-kbnf9"] Mar 10 00:20:01 crc kubenswrapper[4906]: I0310 00:20:01.684046 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-kbnf9" Mar 10 00:20:01 crc kubenswrapper[4906]: I0310 00:20:01.687082 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 10 00:20:01 crc kubenswrapper[4906]: I0310 00:20:01.687163 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 10 00:20:01 crc kubenswrapper[4906]: I0310 00:20:01.687463 4906 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-qv6t5" Mar 10 00:20:01 crc kubenswrapper[4906]: I0310 00:20:01.709084 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-kbnf9"] Mar 10 00:20:01 crc kubenswrapper[4906]: I0310 00:20:01.755054 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd6zq\" (UniqueName: \"kubernetes.io/projected/30781eae-a8cc-4149-bee1-43feb29845ba-kube-api-access-sd6zq\") pod \"cert-manager-webhook-6888856db4-kbnf9\" (UID: \"30781eae-a8cc-4149-bee1-43feb29845ba\") " pod="cert-manager/cert-manager-webhook-6888856db4-kbnf9" Mar 10 00:20:01 crc kubenswrapper[4906]: I0310 00:20:01.755297 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/30781eae-a8cc-4149-bee1-43feb29845ba-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-kbnf9\" (UID: \"30781eae-a8cc-4149-bee1-43feb29845ba\") " pod="cert-manager/cert-manager-webhook-6888856db4-kbnf9" Mar 10 00:20:01 crc kubenswrapper[4906]: I0310 00:20:01.856753 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/30781eae-a8cc-4149-bee1-43feb29845ba-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-kbnf9\" (UID: 
\"30781eae-a8cc-4149-bee1-43feb29845ba\") " pod="cert-manager/cert-manager-webhook-6888856db4-kbnf9" Mar 10 00:20:01 crc kubenswrapper[4906]: I0310 00:20:01.856847 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sd6zq\" (UniqueName: \"kubernetes.io/projected/30781eae-a8cc-4149-bee1-43feb29845ba-kube-api-access-sd6zq\") pod \"cert-manager-webhook-6888856db4-kbnf9\" (UID: \"30781eae-a8cc-4149-bee1-43feb29845ba\") " pod="cert-manager/cert-manager-webhook-6888856db4-kbnf9" Mar 10 00:20:01 crc kubenswrapper[4906]: I0310 00:20:01.874524 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/30781eae-a8cc-4149-bee1-43feb29845ba-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-kbnf9\" (UID: \"30781eae-a8cc-4149-bee1-43feb29845ba\") " pod="cert-manager/cert-manager-webhook-6888856db4-kbnf9" Mar 10 00:20:01 crc kubenswrapper[4906]: I0310 00:20:01.875819 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd6zq\" (UniqueName: \"kubernetes.io/projected/30781eae-a8cc-4149-bee1-43feb29845ba-kube-api-access-sd6zq\") pod \"cert-manager-webhook-6888856db4-kbnf9\" (UID: \"30781eae-a8cc-4149-bee1-43feb29845ba\") " pod="cert-manager/cert-manager-webhook-6888856db4-kbnf9" Mar 10 00:20:02 crc kubenswrapper[4906]: I0310 00:20:02.000439 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-kbnf9" Mar 10 00:20:02 crc kubenswrapper[4906]: I0310 00:20:02.416119 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551700-8txcl" event={"ID":"f4f0a933-9b21-4d0b-8f8f-29e4c93f0336","Type":"ContainerStarted","Data":"75a02f0a09d6cfec5c5dfae1f7359fc71b1fb3157d79395804cd5e88e7b94a5d"} Mar 10 00:20:02 crc kubenswrapper[4906]: I0310 00:20:02.426863 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-kbnf9"] Mar 10 00:20:02 crc kubenswrapper[4906]: I0310 00:20:02.442899 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551700-8txcl" podStartSLOduration=1.326180187 podStartE2EDuration="2.442880888s" podCreationTimestamp="2026-03-10 00:20:00 +0000 UTC" firstStartedPulling="2026-03-10 00:20:00.796253391 +0000 UTC m=+826.944148513" lastFinishedPulling="2026-03-10 00:20:01.912954112 +0000 UTC m=+828.060849214" observedRunningTime="2026-03-10 00:20:02.44085041 +0000 UTC m=+828.588745522" watchObservedRunningTime="2026-03-10 00:20:02.442880888 +0000 UTC m=+828.590775990" Mar 10 00:20:03 crc kubenswrapper[4906]: I0310 00:20:03.425465 4906 generic.go:334] "Generic (PLEG): container finished" podID="f4f0a933-9b21-4d0b-8f8f-29e4c93f0336" containerID="75a02f0a09d6cfec5c5dfae1f7359fc71b1fb3157d79395804cd5e88e7b94a5d" exitCode=0 Mar 10 00:20:03 crc kubenswrapper[4906]: I0310 00:20:03.425519 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551700-8txcl" event={"ID":"f4f0a933-9b21-4d0b-8f8f-29e4c93f0336","Type":"ContainerDied","Data":"75a02f0a09d6cfec5c5dfae1f7359fc71b1fb3157d79395804cd5e88e7b94a5d"} Mar 10 00:20:03 crc kubenswrapper[4906]: I0310 00:20:03.427693 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-kbnf9" 
event={"ID":"30781eae-a8cc-4149-bee1-43feb29845ba","Type":"ContainerStarted","Data":"0032fe074dbb5fb89c003b3c4445dee20278609cd6d9fb13de374323573a0c36"} Mar 10 00:20:03 crc kubenswrapper[4906]: I0310 00:20:03.826163 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-tlmd5"] Mar 10 00:20:03 crc kubenswrapper[4906]: I0310 00:20:03.827471 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-tlmd5" Mar 10 00:20:03 crc kubenswrapper[4906]: I0310 00:20:03.829888 4906 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-d8pnd" Mar 10 00:20:03 crc kubenswrapper[4906]: I0310 00:20:03.845239 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-tlmd5"] Mar 10 00:20:03 crc kubenswrapper[4906]: I0310 00:20:03.888756 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dc844683-db0c-4dde-8600-17b00f2d66bd-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-tlmd5\" (UID: \"dc844683-db0c-4dde-8600-17b00f2d66bd\") " pod="cert-manager/cert-manager-cainjector-5545bd876-tlmd5" Mar 10 00:20:03 crc kubenswrapper[4906]: I0310 00:20:03.888820 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfsgb\" (UniqueName: \"kubernetes.io/projected/dc844683-db0c-4dde-8600-17b00f2d66bd-kube-api-access-kfsgb\") pod \"cert-manager-cainjector-5545bd876-tlmd5\" (UID: \"dc844683-db0c-4dde-8600-17b00f2d66bd\") " pod="cert-manager/cert-manager-cainjector-5545bd876-tlmd5" Mar 10 00:20:03 crc kubenswrapper[4906]: I0310 00:20:03.990524 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfsgb\" (UniqueName: 
\"kubernetes.io/projected/dc844683-db0c-4dde-8600-17b00f2d66bd-kube-api-access-kfsgb\") pod \"cert-manager-cainjector-5545bd876-tlmd5\" (UID: \"dc844683-db0c-4dde-8600-17b00f2d66bd\") " pod="cert-manager/cert-manager-cainjector-5545bd876-tlmd5" Mar 10 00:20:03 crc kubenswrapper[4906]: I0310 00:20:03.990830 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dc844683-db0c-4dde-8600-17b00f2d66bd-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-tlmd5\" (UID: \"dc844683-db0c-4dde-8600-17b00f2d66bd\") " pod="cert-manager/cert-manager-cainjector-5545bd876-tlmd5" Mar 10 00:20:04 crc kubenswrapper[4906]: I0310 00:20:04.027878 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dc844683-db0c-4dde-8600-17b00f2d66bd-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-tlmd5\" (UID: \"dc844683-db0c-4dde-8600-17b00f2d66bd\") " pod="cert-manager/cert-manager-cainjector-5545bd876-tlmd5" Mar 10 00:20:04 crc kubenswrapper[4906]: I0310 00:20:04.031157 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfsgb\" (UniqueName: \"kubernetes.io/projected/dc844683-db0c-4dde-8600-17b00f2d66bd-kube-api-access-kfsgb\") pod \"cert-manager-cainjector-5545bd876-tlmd5\" (UID: \"dc844683-db0c-4dde-8600-17b00f2d66bd\") " pod="cert-manager/cert-manager-cainjector-5545bd876-tlmd5" Mar 10 00:20:04 crc kubenswrapper[4906]: I0310 00:20:04.153261 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-tlmd5" Mar 10 00:20:04 crc kubenswrapper[4906]: I0310 00:20:04.474396 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-tlmd5"] Mar 10 00:20:04 crc kubenswrapper[4906]: I0310 00:20:04.684408 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551700-8txcl" Mar 10 00:20:04 crc kubenswrapper[4906]: I0310 00:20:04.710095 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whthc\" (UniqueName: \"kubernetes.io/projected/f4f0a933-9b21-4d0b-8f8f-29e4c93f0336-kube-api-access-whthc\") pod \"f4f0a933-9b21-4d0b-8f8f-29e4c93f0336\" (UID: \"f4f0a933-9b21-4d0b-8f8f-29e4c93f0336\") " Mar 10 00:20:04 crc kubenswrapper[4906]: I0310 00:20:04.717995 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4f0a933-9b21-4d0b-8f8f-29e4c93f0336-kube-api-access-whthc" (OuterVolumeSpecName: "kube-api-access-whthc") pod "f4f0a933-9b21-4d0b-8f8f-29e4c93f0336" (UID: "f4f0a933-9b21-4d0b-8f8f-29e4c93f0336"). InnerVolumeSpecName "kube-api-access-whthc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:20:04 crc kubenswrapper[4906]: I0310 00:20:04.811465 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whthc\" (UniqueName: \"kubernetes.io/projected/f4f0a933-9b21-4d0b-8f8f-29e4c93f0336-kube-api-access-whthc\") on node \"crc\" DevicePath \"\"" Mar 10 00:20:05 crc kubenswrapper[4906]: I0310 00:20:05.445111 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551700-8txcl" event={"ID":"f4f0a933-9b21-4d0b-8f8f-29e4c93f0336","Type":"ContainerDied","Data":"39f17f807e3fd4c224512a34a696ee25e1a8afd12a0f140b0600536537084f0b"} Mar 10 00:20:05 crc kubenswrapper[4906]: I0310 00:20:05.445180 4906 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39f17f807e3fd4c224512a34a696ee25e1a8afd12a0f140b0600536537084f0b" Mar 10 00:20:05 crc kubenswrapper[4906]: I0310 00:20:05.445136 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551700-8txcl" Mar 10 00:20:05 crc kubenswrapper[4906]: I0310 00:20:05.446982 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-tlmd5" event={"ID":"dc844683-db0c-4dde-8600-17b00f2d66bd","Type":"ContainerStarted","Data":"a69cc65a8b754112939e2c33216b9317ebda82f4fe36831559a7258fccff8ce1"} Mar 10 00:20:05 crc kubenswrapper[4906]: I0310 00:20:05.505402 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551694-vvnsl"] Mar 10 00:20:05 crc kubenswrapper[4906]: I0310 00:20:05.510843 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551694-vvnsl"] Mar 10 00:20:06 crc kubenswrapper[4906]: I0310 00:20:06.593462 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f53421f-d498-4bfa-8043-678e1083105e" path="/var/lib/kubelet/pods/8f53421f-d498-4bfa-8043-678e1083105e/volumes" Mar 10 00:20:07 crc kubenswrapper[4906]: I0310 00:20:07.776484 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 10 00:20:07 crc kubenswrapper[4906]: E0310 00:20:07.777056 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4f0a933-9b21-4d0b-8f8f-29e4c93f0336" containerName="oc" Mar 10 00:20:07 crc kubenswrapper[4906]: I0310 00:20:07.777069 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4f0a933-9b21-4d0b-8f8f-29e4c93f0336" containerName="oc" Mar 10 00:20:07 crc kubenswrapper[4906]: I0310 00:20:07.777192 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4f0a933-9b21-4d0b-8f8f-29e4c93f0336" containerName="oc" Mar 10 00:20:07 crc kubenswrapper[4906]: I0310 00:20:07.777845 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:07 crc kubenswrapper[4906]: I0310 00:20:07.780029 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-global-ca" Mar 10 00:20:07 crc kubenswrapper[4906]: I0310 00:20:07.780278 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-ca" Mar 10 00:20:07 crc kubenswrapper[4906]: I0310 00:20:07.780854 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-sys-config" Mar 10 00:20:07 crc kubenswrapper[4906]: I0310 00:20:07.782983 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-7plhd" Mar 10 00:20:07 crc kubenswrapper[4906]: I0310 00:20:07.793854 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 10 00:20:07 crc kubenswrapper[4906]: I0310 00:20:07.955913 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/71ddc589-a97a-40d1-8d1c-dd826c4cc828-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:07 crc kubenswrapper[4906]: I0310 00:20:07.955973 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpnxz\" (UniqueName: \"kubernetes.io/projected/71ddc589-a97a-40d1-8d1c-dd826c4cc828-kube-api-access-lpnxz\") pod \"service-telemetry-operator-1-build\" (UID: \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:07 crc kubenswrapper[4906]: I0310 00:20:07.955997 4906 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/71ddc589-a97a-40d1-8d1c-dd826c4cc828-builder-dockercfg-7plhd-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:07 crc kubenswrapper[4906]: I0310 00:20:07.956019 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/71ddc589-a97a-40d1-8d1c-dd826c4cc828-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:07 crc kubenswrapper[4906]: I0310 00:20:07.956044 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/71ddc589-a97a-40d1-8d1c-dd826c4cc828-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:07 crc kubenswrapper[4906]: I0310 00:20:07.956066 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/71ddc589-a97a-40d1-8d1c-dd826c4cc828-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:07 crc kubenswrapper[4906]: I0310 00:20:07.956083 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/71ddc589-a97a-40d1-8d1c-dd826c4cc828-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: 
\"71ddc589-a97a-40d1-8d1c-dd826c4cc828\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:07 crc kubenswrapper[4906]: I0310 00:20:07.956105 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/71ddc589-a97a-40d1-8d1c-dd826c4cc828-builder-dockercfg-7plhd-push\") pod \"service-telemetry-operator-1-build\" (UID: \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:07 crc kubenswrapper[4906]: I0310 00:20:07.956124 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/71ddc589-a97a-40d1-8d1c-dd826c4cc828-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:07 crc kubenswrapper[4906]: I0310 00:20:07.956148 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/71ddc589-a97a-40d1-8d1c-dd826c4cc828-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:07 crc kubenswrapper[4906]: I0310 00:20:07.956167 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/71ddc589-a97a-40d1-8d1c-dd826c4cc828-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:07 crc kubenswrapper[4906]: I0310 00:20:07.956214 4906 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/71ddc589-a97a-40d1-8d1c-dd826c4cc828-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:08 crc kubenswrapper[4906]: I0310 00:20:08.057501 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/71ddc589-a97a-40d1-8d1c-dd826c4cc828-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:08 crc kubenswrapper[4906]: I0310 00:20:08.057567 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/71ddc589-a97a-40d1-8d1c-dd826c4cc828-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:08 crc kubenswrapper[4906]: I0310 00:20:08.057597 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpnxz\" (UniqueName: \"kubernetes.io/projected/71ddc589-a97a-40d1-8d1c-dd826c4cc828-kube-api-access-lpnxz\") pod \"service-telemetry-operator-1-build\" (UID: \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:08 crc kubenswrapper[4906]: I0310 00:20:08.057621 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/71ddc589-a97a-40d1-8d1c-dd826c4cc828-builder-dockercfg-7plhd-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\") " 
pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:08 crc kubenswrapper[4906]: I0310 00:20:08.057656 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/71ddc589-a97a-40d1-8d1c-dd826c4cc828-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:08 crc kubenswrapper[4906]: I0310 00:20:08.057659 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/71ddc589-a97a-40d1-8d1c-dd826c4cc828-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:08 crc kubenswrapper[4906]: I0310 00:20:08.057681 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/71ddc589-a97a-40d1-8d1c-dd826c4cc828-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:08 crc kubenswrapper[4906]: I0310 00:20:08.057750 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/71ddc589-a97a-40d1-8d1c-dd826c4cc828-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:08 crc kubenswrapper[4906]: I0310 00:20:08.057777 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/71ddc589-a97a-40d1-8d1c-dd826c4cc828-build-ca-bundles\") pod 
\"service-telemetry-operator-1-build\" (UID: \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:08 crc kubenswrapper[4906]: I0310 00:20:08.057806 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/71ddc589-a97a-40d1-8d1c-dd826c4cc828-builder-dockercfg-7plhd-push\") pod \"service-telemetry-operator-1-build\" (UID: \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:08 crc kubenswrapper[4906]: I0310 00:20:08.057827 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/71ddc589-a97a-40d1-8d1c-dd826c4cc828-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:08 crc kubenswrapper[4906]: I0310 00:20:08.057854 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/71ddc589-a97a-40d1-8d1c-dd826c4cc828-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:08 crc kubenswrapper[4906]: I0310 00:20:08.057873 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/71ddc589-a97a-40d1-8d1c-dd826c4cc828-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:08 crc kubenswrapper[4906]: I0310 00:20:08.058048 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: 
\"kubernetes.io/empty-dir/71ddc589-a97a-40d1-8d1c-dd826c4cc828-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:08 crc kubenswrapper[4906]: I0310 00:20:08.058137 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/71ddc589-a97a-40d1-8d1c-dd826c4cc828-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:08 crc kubenswrapper[4906]: I0310 00:20:08.058281 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/71ddc589-a97a-40d1-8d1c-dd826c4cc828-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:08 crc kubenswrapper[4906]: I0310 00:20:08.058438 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/71ddc589-a97a-40d1-8d1c-dd826c4cc828-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:08 crc kubenswrapper[4906]: I0310 00:20:08.058651 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/71ddc589-a97a-40d1-8d1c-dd826c4cc828-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:08 crc kubenswrapper[4906]: I0310 00:20:08.058706 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/71ddc589-a97a-40d1-8d1c-dd826c4cc828-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:08 crc kubenswrapper[4906]: I0310 00:20:08.059211 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/71ddc589-a97a-40d1-8d1c-dd826c4cc828-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:08 crc kubenswrapper[4906]: I0310 00:20:08.059566 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/71ddc589-a97a-40d1-8d1c-dd826c4cc828-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:08 crc kubenswrapper[4906]: I0310 00:20:08.065486 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/71ddc589-a97a-40d1-8d1c-dd826c4cc828-builder-dockercfg-7plhd-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:08 crc kubenswrapper[4906]: I0310 00:20:08.070622 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/71ddc589-a97a-40d1-8d1c-dd826c4cc828-builder-dockercfg-7plhd-push\") pod \"service-telemetry-operator-1-build\" (UID: \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:08 crc kubenswrapper[4906]: 
I0310 00:20:08.074883 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpnxz\" (UniqueName: \"kubernetes.io/projected/71ddc589-a97a-40d1-8d1c-dd826c4cc828-kube-api-access-lpnxz\") pod \"service-telemetry-operator-1-build\" (UID: \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:08 crc kubenswrapper[4906]: I0310 00:20:08.093961 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:09 crc kubenswrapper[4906]: I0310 00:20:09.651573 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 10 00:20:10 crc kubenswrapper[4906]: I0310 00:20:10.483750 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"71ddc589-a97a-40d1-8d1c-dd826c4cc828","Type":"ContainerStarted","Data":"193b39d779fbcb6de72abbbd2bf1316f09cceccaba91065a1d44f024eda04911"} Mar 10 00:20:10 crc kubenswrapper[4906]: I0310 00:20:10.485874 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-kbnf9" event={"ID":"30781eae-a8cc-4149-bee1-43feb29845ba","Type":"ContainerStarted","Data":"91d5d2161c20ec078b38e4975c1ab30d94730650de2b373b706c01245f064641"} Mar 10 00:20:10 crc kubenswrapper[4906]: I0310 00:20:10.486366 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-kbnf9" Mar 10 00:20:10 crc kubenswrapper[4906]: I0310 00:20:10.493589 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-tlmd5" event={"ID":"dc844683-db0c-4dde-8600-17b00f2d66bd","Type":"ContainerStarted","Data":"c1a1210d33da8f3827ea29aba9958d5cff42bb262dc4765fea80a2a25bf0f042"} Mar 10 00:20:10 crc kubenswrapper[4906]: I0310 00:20:10.525894 4906 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-kbnf9" podStartSLOduration=2.305700324 podStartE2EDuration="9.52587429s" podCreationTimestamp="2026-03-10 00:20:01 +0000 UTC" firstStartedPulling="2026-03-10 00:20:02.437121816 +0000 UTC m=+828.585016928" lastFinishedPulling="2026-03-10 00:20:09.657295782 +0000 UTC m=+835.805190894" observedRunningTime="2026-03-10 00:20:10.507186354 +0000 UTC m=+836.655081466" watchObservedRunningTime="2026-03-10 00:20:10.52587429 +0000 UTC m=+836.673769402" Mar 10 00:20:10 crc kubenswrapper[4906]: I0310 00:20:10.526237 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-tlmd5" podStartSLOduration=2.2770329719999998 podStartE2EDuration="7.52623162s" podCreationTimestamp="2026-03-10 00:20:03 +0000 UTC" firstStartedPulling="2026-03-10 00:20:04.503587151 +0000 UTC m=+830.651482283" lastFinishedPulling="2026-03-10 00:20:09.752785819 +0000 UTC m=+835.900680931" observedRunningTime="2026-03-10 00:20:10.522357851 +0000 UTC m=+836.670252953" watchObservedRunningTime="2026-03-10 00:20:10.52623162 +0000 UTC m=+836.674126732" Mar 10 00:20:11 crc kubenswrapper[4906]: I0310 00:20:11.714780 4906 prober.go:107] "Probe failed" probeType="Readiness" pod="service-telemetry/elasticsearch-es-default-0" podUID="90a4eb11-c2c3-4d86-8cac-ddccdbab507a" containerName="elasticsearch" probeResult="failure" output=< Mar 10 00:20:11 crc kubenswrapper[4906]: {"timestamp": "2026-03-10T00:20:11+00:00", "message": "readiness probe failed", "curl_rc": "7"} Mar 10 00:20:11 crc kubenswrapper[4906]: > Mar 10 00:20:16 crc kubenswrapper[4906]: I0310 00:20:16.561866 4906 generic.go:334] "Generic (PLEG): container finished" podID="71ddc589-a97a-40d1-8d1c-dd826c4cc828" containerID="21e49d9d46dbd282ad2ade48d3ec58b1fc6bee887de4cba9621b9af9af71e820" exitCode=0 Mar 10 00:20:16 crc kubenswrapper[4906]: I0310 00:20:16.561935 4906 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"71ddc589-a97a-40d1-8d1c-dd826c4cc828","Type":"ContainerDied","Data":"21e49d9d46dbd282ad2ade48d3ec58b1fc6bee887de4cba9621b9af9af71e820"} Mar 10 00:20:17 crc kubenswrapper[4906]: I0310 00:20:17.004511 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-kbnf9" Mar 10 00:20:17 crc kubenswrapper[4906]: I0310 00:20:17.128449 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/elasticsearch-es-default-0" Mar 10 00:20:17 crc kubenswrapper[4906]: I0310 00:20:17.573404 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"71ddc589-a97a-40d1-8d1c-dd826c4cc828","Type":"ContainerStarted","Data":"7b5da962466667d0b7064050344bad582835f960f2edfa9a0e3bc3ff6a0356d5"} Mar 10 00:20:17 crc kubenswrapper[4906]: I0310 00:20:17.621023 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-1-build" podStartSLOduration=4.56542253 podStartE2EDuration="10.620998136s" podCreationTimestamp="2026-03-10 00:20:07 +0000 UTC" firstStartedPulling="2026-03-10 00:20:09.671511772 +0000 UTC m=+835.819406884" lastFinishedPulling="2026-03-10 00:20:15.727087388 +0000 UTC m=+841.874982490" observedRunningTime="2026-03-10 00:20:17.61331779 +0000 UTC m=+843.761212902" watchObservedRunningTime="2026-03-10 00:20:17.620998136 +0000 UTC m=+843.768893268" Mar 10 00:20:18 crc kubenswrapper[4906]: I0310 00:20:18.055489 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 10 00:20:19 crc kubenswrapper[4906]: I0310 00:20:19.588514 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/service-telemetry-operator-1-build" 
podUID="71ddc589-a97a-40d1-8d1c-dd826c4cc828" containerName="docker-build" containerID="cri-o://7b5da962466667d0b7064050344bad582835f960f2edfa9a0e3bc3ff6a0356d5" gracePeriod=30 Mar 10 00:20:19 crc kubenswrapper[4906]: I0310 00:20:19.680998 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Mar 10 00:20:19 crc kubenswrapper[4906]: I0310 00:20:19.683351 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:19 crc kubenswrapper[4906]: I0310 00:20:19.688801 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-ca" Mar 10 00:20:19 crc kubenswrapper[4906]: I0310 00:20:19.690605 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-sys-config" Mar 10 00:20:19 crc kubenswrapper[4906]: I0310 00:20:19.691090 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-global-ca" Mar 10 00:20:19 crc kubenswrapper[4906]: I0310 00:20:19.722517 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Mar 10 00:20:19 crc kubenswrapper[4906]: I0310 00:20:19.865870 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-builder-dockercfg-7plhd-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:19 crc kubenswrapper[4906]: I0310 00:20:19.866363 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: 
\"kubernetes.io/empty-dir/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:19 crc kubenswrapper[4906]: I0310 00:20:19.866472 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:19 crc kubenswrapper[4906]: I0310 00:20:19.866761 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:19 crc kubenswrapper[4906]: I0310 00:20:19.866859 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:19 crc kubenswrapper[4906]: I0310 00:20:19.866945 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:19 crc 
kubenswrapper[4906]: I0310 00:20:19.867000 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:19 crc kubenswrapper[4906]: I0310 00:20:19.867077 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:19 crc kubenswrapper[4906]: I0310 00:20:19.867175 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-builder-dockercfg-7plhd-push\") pod \"service-telemetry-operator-2-build\" (UID: \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:19 crc kubenswrapper[4906]: I0310 00:20:19.867226 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwn77\" (UniqueName: \"kubernetes.io/projected/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-kube-api-access-zwn77\") pod \"service-telemetry-operator-2-build\" (UID: \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:19 crc kubenswrapper[4906]: I0310 00:20:19.867301 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:19 crc kubenswrapper[4906]: I0310 00:20:19.867381 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:19 crc kubenswrapper[4906]: I0310 00:20:19.969045 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:19 crc kubenswrapper[4906]: I0310 00:20:19.969150 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-builder-dockercfg-7plhd-push\") pod \"service-telemetry-operator-2-build\" (UID: \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:19 crc kubenswrapper[4906]: I0310 00:20:19.969195 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwn77\" (UniqueName: \"kubernetes.io/projected/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-kube-api-access-zwn77\") pod \"service-telemetry-operator-2-build\" (UID: \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:19 crc kubenswrapper[4906]: I0310 
00:20:19.969250 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:19 crc kubenswrapper[4906]: I0310 00:20:19.969299 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:19 crc kubenswrapper[4906]: I0310 00:20:19.969418 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-builder-dockercfg-7plhd-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:19 crc kubenswrapper[4906]: I0310 00:20:19.969461 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:19 crc kubenswrapper[4906]: I0310 00:20:19.969496 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: 
\"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:19 crc kubenswrapper[4906]: I0310 00:20:19.969535 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:19 crc kubenswrapper[4906]: I0310 00:20:19.969576 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:19 crc kubenswrapper[4906]: I0310 00:20:19.969627 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:19 crc kubenswrapper[4906]: I0310 00:20:19.969702 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:19 crc kubenswrapper[4906]: I0310 00:20:19.969184 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: 
\"kubernetes.io/host-path/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:19 crc kubenswrapper[4906]: I0310 00:20:19.970296 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:19 crc kubenswrapper[4906]: I0310 00:20:19.970539 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:19 crc kubenswrapper[4906]: I0310 00:20:19.970682 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:19 crc kubenswrapper[4906]: I0310 00:20:19.971147 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:19 crc kubenswrapper[4906]: I0310 00:20:19.971783 4906 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:19 crc kubenswrapper[4906]: I0310 00:20:19.971868 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:19 crc kubenswrapper[4906]: I0310 00:20:19.972070 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:19 crc kubenswrapper[4906]: I0310 00:20:19.972215 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:19 crc kubenswrapper[4906]: I0310 00:20:19.976469 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-builder-dockercfg-7plhd-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:19 crc kubenswrapper[4906]: I0310 00:20:19.985445 
4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-builder-dockercfg-7plhd-push\") pod \"service-telemetry-operator-2-build\" (UID: \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:20 crc kubenswrapper[4906]: I0310 00:20:20.003781 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwn77\" (UniqueName: \"kubernetes.io/projected/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-kube-api-access-zwn77\") pod \"service-telemetry-operator-2-build\" (UID: \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:20 crc kubenswrapper[4906]: I0310 00:20:20.027063 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:20:20 crc kubenswrapper[4906]: I0310 00:20:20.605355 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-kbpfz"] Mar 10 00:20:20 crc kubenswrapper[4906]: I0310 00:20:20.607170 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-kbpfz" Mar 10 00:20:20 crc kubenswrapper[4906]: I0310 00:20:20.610360 4906 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-6zllr" Mar 10 00:20:20 crc kubenswrapper[4906]: I0310 00:20:20.629373 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-kbpfz"] Mar 10 00:20:20 crc kubenswrapper[4906]: I0310 00:20:20.783005 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e31ca248-2d64-4a9b-82d0-e374779ccb46-bound-sa-token\") pod \"cert-manager-545d4d4674-kbpfz\" (UID: \"e31ca248-2d64-4a9b-82d0-e374779ccb46\") " pod="cert-manager/cert-manager-545d4d4674-kbpfz" Mar 10 00:20:20 crc kubenswrapper[4906]: I0310 00:20:20.783098 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jx4t4\" (UniqueName: \"kubernetes.io/projected/e31ca248-2d64-4a9b-82d0-e374779ccb46-kube-api-access-jx4t4\") pod \"cert-manager-545d4d4674-kbpfz\" (UID: \"e31ca248-2d64-4a9b-82d0-e374779ccb46\") " pod="cert-manager/cert-manager-545d4d4674-kbpfz" Mar 10 00:20:20 crc kubenswrapper[4906]: I0310 00:20:20.885264 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e31ca248-2d64-4a9b-82d0-e374779ccb46-bound-sa-token\") pod \"cert-manager-545d4d4674-kbpfz\" (UID: \"e31ca248-2d64-4a9b-82d0-e374779ccb46\") " pod="cert-manager/cert-manager-545d4d4674-kbpfz" Mar 10 00:20:20 crc kubenswrapper[4906]: I0310 00:20:20.885343 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jx4t4\" (UniqueName: \"kubernetes.io/projected/e31ca248-2d64-4a9b-82d0-e374779ccb46-kube-api-access-jx4t4\") pod \"cert-manager-545d4d4674-kbpfz\" (UID: 
\"e31ca248-2d64-4a9b-82d0-e374779ccb46\") " pod="cert-manager/cert-manager-545d4d4674-kbpfz" Mar 10 00:20:20 crc kubenswrapper[4906]: I0310 00:20:20.910884 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e31ca248-2d64-4a9b-82d0-e374779ccb46-bound-sa-token\") pod \"cert-manager-545d4d4674-kbpfz\" (UID: \"e31ca248-2d64-4a9b-82d0-e374779ccb46\") " pod="cert-manager/cert-manager-545d4d4674-kbpfz" Mar 10 00:20:20 crc kubenswrapper[4906]: I0310 00:20:20.911111 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jx4t4\" (UniqueName: \"kubernetes.io/projected/e31ca248-2d64-4a9b-82d0-e374779ccb46-kube-api-access-jx4t4\") pod \"cert-manager-545d4d4674-kbpfz\" (UID: \"e31ca248-2d64-4a9b-82d0-e374779ccb46\") " pod="cert-manager/cert-manager-545d4d4674-kbpfz" Mar 10 00:20:20 crc kubenswrapper[4906]: I0310 00:20:20.934778 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-kbpfz" Mar 10 00:20:21 crc kubenswrapper[4906]: I0310 00:20:21.518286 4906 scope.go:117] "RemoveContainer" containerID="81e618ed887c40423b6e3116ed9cea0c7ae9a0fad654dc75a1cb225744f296ad" Mar 10 00:20:21 crc kubenswrapper[4906]: I0310 00:20:21.944140 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-kbpfz"] Mar 10 00:20:21 crc kubenswrapper[4906]: W0310 00:20:21.950489 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode31ca248_2d64_4a9b_82d0_e374779ccb46.slice/crio-842e0752ed0b2d8be2fd560be73ab24a9199bac6c7047ad0e0bcfba5bfa72faa WatchSource:0}: Error finding container 842e0752ed0b2d8be2fd560be73ab24a9199bac6c7047ad0e0bcfba5bfa72faa: Status 404 returned error can't find the container with id 842e0752ed0b2d8be2fd560be73ab24a9199bac6c7047ad0e0bcfba5bfa72faa Mar 10 00:20:22 crc kubenswrapper[4906]: 
I0310 00:20:22.123564 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Mar 10 00:20:22 crc kubenswrapper[4906]: W0310 00:20:22.138143 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2b34ffe_7101_460a_b2ed_b7c3a3e9725b.slice/crio-289f8f06e9c49e92b3257348305f4974dcbc81201de064287ffa9bed598133de WatchSource:0}: Error finding container 289f8f06e9c49e92b3257348305f4974dcbc81201de064287ffa9bed598133de: Status 404 returned error can't find the container with id 289f8f06e9c49e92b3257348305f4974dcbc81201de064287ffa9bed598133de Mar 10 00:20:22 crc kubenswrapper[4906]: I0310 00:20:22.619058 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-kbpfz" event={"ID":"e31ca248-2d64-4a9b-82d0-e374779ccb46","Type":"ContainerStarted","Data":"842e0752ed0b2d8be2fd560be73ab24a9199bac6c7047ad0e0bcfba5bfa72faa"} Mar 10 00:20:22 crc kubenswrapper[4906]: I0310 00:20:22.621006 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b","Type":"ContainerStarted","Data":"289f8f06e9c49e92b3257348305f4974dcbc81201de064287ffa9bed598133de"} Mar 10 00:20:23 crc kubenswrapper[4906]: I0310 00:20:23.641961 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-1-build_71ddc589-a97a-40d1-8d1c-dd826c4cc828/docker-build/0.log" Mar 10 00:20:23 crc kubenswrapper[4906]: I0310 00:20:23.643023 4906 generic.go:334] "Generic (PLEG): container finished" podID="71ddc589-a97a-40d1-8d1c-dd826c4cc828" containerID="7b5da962466667d0b7064050344bad582835f960f2edfa9a0e3bc3ff6a0356d5" exitCode=1 Mar 10 00:20:23 crc kubenswrapper[4906]: I0310 00:20:23.643142 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" 
event={"ID":"71ddc589-a97a-40d1-8d1c-dd826c4cc828","Type":"ContainerDied","Data":"7b5da962466667d0b7064050344bad582835f960f2edfa9a0e3bc3ff6a0356d5"} Mar 10 00:20:23 crc kubenswrapper[4906]: I0310 00:20:23.644951 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-kbpfz" event={"ID":"e31ca248-2d64-4a9b-82d0-e374779ccb46","Type":"ContainerStarted","Data":"246d74c073a86864812e5b922c586b814e4cf77fe937898af691ef307fa9ea01"} Mar 10 00:20:23 crc kubenswrapper[4906]: I0310 00:20:23.646949 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b","Type":"ContainerStarted","Data":"2c6e2df28eeb915b8e5ab3f79a1e226d0256e6f7ef52f87aa8dcac0d529e6026"} Mar 10 00:20:23 crc kubenswrapper[4906]: I0310 00:20:23.663202 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-kbpfz" podStartSLOduration=3.663183886 podStartE2EDuration="3.663183886s" podCreationTimestamp="2026-03-10 00:20:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:20:23.661380785 +0000 UTC m=+849.809275907" watchObservedRunningTime="2026-03-10 00:20:23.663183886 +0000 UTC m=+849.811079008" Mar 10 00:20:23 crc kubenswrapper[4906]: I0310 00:20:23.805261 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-1-build_71ddc589-a97a-40d1-8d1c-dd826c4cc828/docker-build/0.log" Mar 10 00:20:23 crc kubenswrapper[4906]: I0310 00:20:23.806121 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:23 crc kubenswrapper[4906]: I0310 00:20:23.942275 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/71ddc589-a97a-40d1-8d1c-dd826c4cc828-builder-dockercfg-7plhd-push\") pod \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\" (UID: \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\") " Mar 10 00:20:23 crc kubenswrapper[4906]: I0310 00:20:23.942350 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/71ddc589-a97a-40d1-8d1c-dd826c4cc828-build-ca-bundles\") pod \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\" (UID: \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\") " Mar 10 00:20:23 crc kubenswrapper[4906]: I0310 00:20:23.942413 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/71ddc589-a97a-40d1-8d1c-dd826c4cc828-build-proxy-ca-bundles\") pod \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\" (UID: \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\") " Mar 10 00:20:23 crc kubenswrapper[4906]: I0310 00:20:23.942458 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/71ddc589-a97a-40d1-8d1c-dd826c4cc828-container-storage-root\") pod \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\" (UID: \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\") " Mar 10 00:20:23 crc kubenswrapper[4906]: I0310 00:20:23.942506 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/71ddc589-a97a-40d1-8d1c-dd826c4cc828-container-storage-run\") pod \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\" (UID: \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\") " Mar 10 00:20:23 crc kubenswrapper[4906]: I0310 00:20:23.942586 4906 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/71ddc589-a97a-40d1-8d1c-dd826c4cc828-build-blob-cache\") pod \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\" (UID: \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\") " Mar 10 00:20:23 crc kubenswrapper[4906]: I0310 00:20:23.942655 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/71ddc589-a97a-40d1-8d1c-dd826c4cc828-build-system-configs\") pod \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\" (UID: \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\") " Mar 10 00:20:23 crc kubenswrapper[4906]: I0310 00:20:23.942694 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/71ddc589-a97a-40d1-8d1c-dd826c4cc828-builder-dockercfg-7plhd-pull\") pod \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\" (UID: \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\") " Mar 10 00:20:23 crc kubenswrapper[4906]: I0310 00:20:23.942720 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/71ddc589-a97a-40d1-8d1c-dd826c4cc828-node-pullsecrets\") pod \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\" (UID: \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\") " Mar 10 00:20:23 crc kubenswrapper[4906]: I0310 00:20:23.942925 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/71ddc589-a97a-40d1-8d1c-dd826c4cc828-buildworkdir\") pod \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\" (UID: \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\") " Mar 10 00:20:23 crc kubenswrapper[4906]: I0310 00:20:23.942965 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/71ddc589-a97a-40d1-8d1c-dd826c4cc828-buildcachedir\") 
pod \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\" (UID: \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\") " Mar 10 00:20:23 crc kubenswrapper[4906]: I0310 00:20:23.943485 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71ddc589-a97a-40d1-8d1c-dd826c4cc828-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "71ddc589-a97a-40d1-8d1c-dd826c4cc828" (UID: "71ddc589-a97a-40d1-8d1c-dd826c4cc828"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:20:23 crc kubenswrapper[4906]: I0310 00:20:23.943907 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/71ddc589-a97a-40d1-8d1c-dd826c4cc828-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "71ddc589-a97a-40d1-8d1c-dd826c4cc828" (UID: "71ddc589-a97a-40d1-8d1c-dd826c4cc828"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:20:23 crc kubenswrapper[4906]: I0310 00:20:23.944386 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71ddc589-a97a-40d1-8d1c-dd826c4cc828-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "71ddc589-a97a-40d1-8d1c-dd826c4cc828" (UID: "71ddc589-a97a-40d1-8d1c-dd826c4cc828"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:20:23 crc kubenswrapper[4906]: I0310 00:20:23.944469 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/71ddc589-a97a-40d1-8d1c-dd826c4cc828-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "71ddc589-a97a-40d1-8d1c-dd826c4cc828" (UID: "71ddc589-a97a-40d1-8d1c-dd826c4cc828"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:20:23 crc kubenswrapper[4906]: I0310 00:20:23.944682 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71ddc589-a97a-40d1-8d1c-dd826c4cc828-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "71ddc589-a97a-40d1-8d1c-dd826c4cc828" (UID: "71ddc589-a97a-40d1-8d1c-dd826c4cc828"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:20:23 crc kubenswrapper[4906]: I0310 00:20:23.945011 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71ddc589-a97a-40d1-8d1c-dd826c4cc828-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "71ddc589-a97a-40d1-8d1c-dd826c4cc828" (UID: "71ddc589-a97a-40d1-8d1c-dd826c4cc828"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:20:23 crc kubenswrapper[4906]: I0310 00:20:23.945070 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpnxz\" (UniqueName: \"kubernetes.io/projected/71ddc589-a97a-40d1-8d1c-dd826c4cc828-kube-api-access-lpnxz\") pod \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\" (UID: \"71ddc589-a97a-40d1-8d1c-dd826c4cc828\") " Mar 10 00:20:23 crc kubenswrapper[4906]: I0310 00:20:23.945666 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71ddc589-a97a-40d1-8d1c-dd826c4cc828-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "71ddc589-a97a-40d1-8d1c-dd826c4cc828" (UID: "71ddc589-a97a-40d1-8d1c-dd826c4cc828"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:20:23 crc kubenswrapper[4906]: I0310 00:20:23.945713 4906 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/71ddc589-a97a-40d1-8d1c-dd826c4cc828-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 10 00:20:23 crc kubenswrapper[4906]: I0310 00:20:23.945741 4906 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/71ddc589-a97a-40d1-8d1c-dd826c4cc828-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 10 00:20:23 crc kubenswrapper[4906]: I0310 00:20:23.945757 4906 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/71ddc589-a97a-40d1-8d1c-dd826c4cc828-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 10 00:20:23 crc kubenswrapper[4906]: I0310 00:20:23.945774 4906 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/71ddc589-a97a-40d1-8d1c-dd826c4cc828-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 10 00:20:23 crc kubenswrapper[4906]: I0310 00:20:23.945789 4906 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/71ddc589-a97a-40d1-8d1c-dd826c4cc828-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 10 00:20:23 crc kubenswrapper[4906]: I0310 00:20:23.945807 4906 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/71ddc589-a97a-40d1-8d1c-dd826c4cc828-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 10 00:20:23 crc kubenswrapper[4906]: I0310 00:20:23.947525 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71ddc589-a97a-40d1-8d1c-dd826c4cc828-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod 
"71ddc589-a97a-40d1-8d1c-dd826c4cc828" (UID: "71ddc589-a97a-40d1-8d1c-dd826c4cc828"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:20:23 crc kubenswrapper[4906]: I0310 00:20:23.947585 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71ddc589-a97a-40d1-8d1c-dd826c4cc828-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "71ddc589-a97a-40d1-8d1c-dd826c4cc828" (UID: "71ddc589-a97a-40d1-8d1c-dd826c4cc828"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:20:23 crc kubenswrapper[4906]: I0310 00:20:23.951349 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71ddc589-a97a-40d1-8d1c-dd826c4cc828-builder-dockercfg-7plhd-push" (OuterVolumeSpecName: "builder-dockercfg-7plhd-push") pod "71ddc589-a97a-40d1-8d1c-dd826c4cc828" (UID: "71ddc589-a97a-40d1-8d1c-dd826c4cc828"). InnerVolumeSpecName "builder-dockercfg-7plhd-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:20:23 crc kubenswrapper[4906]: I0310 00:20:23.951458 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71ddc589-a97a-40d1-8d1c-dd826c4cc828-kube-api-access-lpnxz" (OuterVolumeSpecName: "kube-api-access-lpnxz") pod "71ddc589-a97a-40d1-8d1c-dd826c4cc828" (UID: "71ddc589-a97a-40d1-8d1c-dd826c4cc828"). InnerVolumeSpecName "kube-api-access-lpnxz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:20:23 crc kubenswrapper[4906]: I0310 00:20:23.960558 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71ddc589-a97a-40d1-8d1c-dd826c4cc828-builder-dockercfg-7plhd-pull" (OuterVolumeSpecName: "builder-dockercfg-7plhd-pull") pod "71ddc589-a97a-40d1-8d1c-dd826c4cc828" (UID: "71ddc589-a97a-40d1-8d1c-dd826c4cc828"). 
InnerVolumeSpecName "builder-dockercfg-7plhd-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:20:24 crc kubenswrapper[4906]: I0310 00:20:24.060704 4906 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/71ddc589-a97a-40d1-8d1c-dd826c4cc828-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 10 00:20:24 crc kubenswrapper[4906]: I0310 00:20:24.060751 4906 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/71ddc589-a97a-40d1-8d1c-dd826c4cc828-builder-dockercfg-7plhd-pull\") on node \"crc\" DevicePath \"\"" Mar 10 00:20:24 crc kubenswrapper[4906]: I0310 00:20:24.060762 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpnxz\" (UniqueName: \"kubernetes.io/projected/71ddc589-a97a-40d1-8d1c-dd826c4cc828-kube-api-access-lpnxz\") on node \"crc\" DevicePath \"\"" Mar 10 00:20:24 crc kubenswrapper[4906]: I0310 00:20:24.060771 4906 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/71ddc589-a97a-40d1-8d1c-dd826c4cc828-builder-dockercfg-7plhd-push\") on node \"crc\" DevicePath \"\"" Mar 10 00:20:24 crc kubenswrapper[4906]: I0310 00:20:24.060781 4906 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/71ddc589-a97a-40d1-8d1c-dd826c4cc828-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:20:24 crc kubenswrapper[4906]: I0310 00:20:24.060789 4906 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/71ddc589-a97a-40d1-8d1c-dd826c4cc828-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:20:24 crc kubenswrapper[4906]: I0310 00:20:24.657852 4906 log.go:25] "Finished parsing log file" 
path="/var/log/pods/service-telemetry_service-telemetry-operator-1-build_71ddc589-a97a-40d1-8d1c-dd826c4cc828/docker-build/0.log" Mar 10 00:20:24 crc kubenswrapper[4906]: I0310 00:20:24.658865 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"71ddc589-a97a-40d1-8d1c-dd826c4cc828","Type":"ContainerDied","Data":"193b39d779fbcb6de72abbbd2bf1316f09cceccaba91065a1d44f024eda04911"} Mar 10 00:20:24 crc kubenswrapper[4906]: I0310 00:20:24.658934 4906 scope.go:117] "RemoveContainer" containerID="7b5da962466667d0b7064050344bad582835f960f2edfa9a0e3bc3ff6a0356d5" Mar 10 00:20:24 crc kubenswrapper[4906]: I0310 00:20:24.659038 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Mar 10 00:20:24 crc kubenswrapper[4906]: I0310 00:20:24.690492 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 10 00:20:24 crc kubenswrapper[4906]: I0310 00:20:24.695335 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 10 00:20:24 crc kubenswrapper[4906]: I0310 00:20:24.709709 4906 scope.go:117] "RemoveContainer" containerID="21e49d9d46dbd282ad2ade48d3ec58b1fc6bee887de4cba9621b9af9af71e820" Mar 10 00:20:26 crc kubenswrapper[4906]: I0310 00:20:26.587000 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71ddc589-a97a-40d1-8d1c-dd826c4cc828" path="/var/lib/kubelet/pods/71ddc589-a97a-40d1-8d1c-dd826c4cc828/volumes" Mar 10 00:20:33 crc kubenswrapper[4906]: I0310 00:20:33.741008 4906 generic.go:334] "Generic (PLEG): container finished" podID="d2b34ffe-7101-460a-b2ed-b7c3a3e9725b" containerID="2c6e2df28eeb915b8e5ab3f79a1e226d0256e6f7ef52f87aa8dcac0d529e6026" exitCode=0 Mar 10 00:20:33 crc kubenswrapper[4906]: I0310 00:20:33.741124 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b","Type":"ContainerDied","Data":"2c6e2df28eeb915b8e5ab3f79a1e226d0256e6f7ef52f87aa8dcac0d529e6026"} Mar 10 00:20:34 crc kubenswrapper[4906]: I0310 00:20:34.758010 4906 generic.go:334] "Generic (PLEG): container finished" podID="d2b34ffe-7101-460a-b2ed-b7c3a3e9725b" containerID="fbf13cb1468bff61a7d8601725116fd0a1e8fd6e829d8728d8cc0b669e97f384" exitCode=0 Mar 10 00:20:34 crc kubenswrapper[4906]: I0310 00:20:34.758079 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b","Type":"ContainerDied","Data":"fbf13cb1468bff61a7d8601725116fd0a1e8fd6e829d8728d8cc0b669e97f384"} Mar 10 00:20:34 crc kubenswrapper[4906]: I0310 00:20:34.825604 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-2-build_d2b34ffe-7101-460a-b2ed-b7c3a3e9725b/manage-dockerfile/0.log" Mar 10 00:20:35 crc kubenswrapper[4906]: I0310 00:20:35.772168 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b","Type":"ContainerStarted","Data":"d569f871b22a55b60f2586f719ad1b3e646960872ab0b96d9ddec341fe9cc697"} Mar 10 00:20:35 crc kubenswrapper[4906]: I0310 00:20:35.829107 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-2-build" podStartSLOduration=16.82908009 podStartE2EDuration="16.82908009s" podCreationTimestamp="2026-03-10 00:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:20:35.822417812 +0000 UTC m=+861.970312964" watchObservedRunningTime="2026-03-10 00:20:35.82908009 +0000 UTC m=+861.976975232" Mar 10 00:21:53 crc kubenswrapper[4906]: I0310 
00:21:53.215272 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-r7lrn"] Mar 10 00:21:53 crc kubenswrapper[4906]: E0310 00:21:53.217143 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71ddc589-a97a-40d1-8d1c-dd826c4cc828" containerName="docker-build" Mar 10 00:21:53 crc kubenswrapper[4906]: I0310 00:21:53.217159 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="71ddc589-a97a-40d1-8d1c-dd826c4cc828" containerName="docker-build" Mar 10 00:21:53 crc kubenswrapper[4906]: E0310 00:21:53.217179 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71ddc589-a97a-40d1-8d1c-dd826c4cc828" containerName="manage-dockerfile" Mar 10 00:21:53 crc kubenswrapper[4906]: I0310 00:21:53.217187 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="71ddc589-a97a-40d1-8d1c-dd826c4cc828" containerName="manage-dockerfile" Mar 10 00:21:53 crc kubenswrapper[4906]: I0310 00:21:53.217310 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="71ddc589-a97a-40d1-8d1c-dd826c4cc828" containerName="docker-build" Mar 10 00:21:53 crc kubenswrapper[4906]: I0310 00:21:53.218267 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r7lrn" Mar 10 00:21:53 crc kubenswrapper[4906]: I0310 00:21:53.229254 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r7lrn"] Mar 10 00:21:53 crc kubenswrapper[4906]: I0310 00:21:53.282604 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe4340c4-acc2-442e-9510-fa5818aa6b73-catalog-content\") pod \"community-operators-r7lrn\" (UID: \"fe4340c4-acc2-442e-9510-fa5818aa6b73\") " pod="openshift-marketplace/community-operators-r7lrn" Mar 10 00:21:53 crc kubenswrapper[4906]: I0310 00:21:53.282674 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh2z5\" (UniqueName: \"kubernetes.io/projected/fe4340c4-acc2-442e-9510-fa5818aa6b73-kube-api-access-gh2z5\") pod \"community-operators-r7lrn\" (UID: \"fe4340c4-acc2-442e-9510-fa5818aa6b73\") " pod="openshift-marketplace/community-operators-r7lrn" Mar 10 00:21:53 crc kubenswrapper[4906]: I0310 00:21:53.282770 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe4340c4-acc2-442e-9510-fa5818aa6b73-utilities\") pod \"community-operators-r7lrn\" (UID: \"fe4340c4-acc2-442e-9510-fa5818aa6b73\") " pod="openshift-marketplace/community-operators-r7lrn" Mar 10 00:21:53 crc kubenswrapper[4906]: I0310 00:21:53.383746 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe4340c4-acc2-442e-9510-fa5818aa6b73-catalog-content\") pod \"community-operators-r7lrn\" (UID: \"fe4340c4-acc2-442e-9510-fa5818aa6b73\") " pod="openshift-marketplace/community-operators-r7lrn" Mar 10 00:21:53 crc kubenswrapper[4906]: I0310 00:21:53.383799 4906 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-gh2z5\" (UniqueName: \"kubernetes.io/projected/fe4340c4-acc2-442e-9510-fa5818aa6b73-kube-api-access-gh2z5\") pod \"community-operators-r7lrn\" (UID: \"fe4340c4-acc2-442e-9510-fa5818aa6b73\") " pod="openshift-marketplace/community-operators-r7lrn" Mar 10 00:21:53 crc kubenswrapper[4906]: I0310 00:21:53.383830 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe4340c4-acc2-442e-9510-fa5818aa6b73-utilities\") pod \"community-operators-r7lrn\" (UID: \"fe4340c4-acc2-442e-9510-fa5818aa6b73\") " pod="openshift-marketplace/community-operators-r7lrn" Mar 10 00:21:53 crc kubenswrapper[4906]: I0310 00:21:53.384308 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe4340c4-acc2-442e-9510-fa5818aa6b73-catalog-content\") pod \"community-operators-r7lrn\" (UID: \"fe4340c4-acc2-442e-9510-fa5818aa6b73\") " pod="openshift-marketplace/community-operators-r7lrn" Mar 10 00:21:53 crc kubenswrapper[4906]: I0310 00:21:53.384361 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe4340c4-acc2-442e-9510-fa5818aa6b73-utilities\") pod \"community-operators-r7lrn\" (UID: \"fe4340c4-acc2-442e-9510-fa5818aa6b73\") " pod="openshift-marketplace/community-operators-r7lrn" Mar 10 00:21:53 crc kubenswrapper[4906]: I0310 00:21:53.413371 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh2z5\" (UniqueName: \"kubernetes.io/projected/fe4340c4-acc2-442e-9510-fa5818aa6b73-kube-api-access-gh2z5\") pod \"community-operators-r7lrn\" (UID: \"fe4340c4-acc2-442e-9510-fa5818aa6b73\") " pod="openshift-marketplace/community-operators-r7lrn" Mar 10 00:21:53 crc kubenswrapper[4906]: I0310 00:21:53.538683 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r7lrn" Mar 10 00:21:53 crc kubenswrapper[4906]: I0310 00:21:53.865667 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r7lrn"] Mar 10 00:21:54 crc kubenswrapper[4906]: I0310 00:21:54.410697 4906 generic.go:334] "Generic (PLEG): container finished" podID="fe4340c4-acc2-442e-9510-fa5818aa6b73" containerID="8d307f3a9f5d6b4a7812dbaf6ae05b3462a68fd9909c726725a82b025c63f8d6" exitCode=0 Mar 10 00:21:54 crc kubenswrapper[4906]: I0310 00:21:54.410854 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r7lrn" event={"ID":"fe4340c4-acc2-442e-9510-fa5818aa6b73","Type":"ContainerDied","Data":"8d307f3a9f5d6b4a7812dbaf6ae05b3462a68fd9909c726725a82b025c63f8d6"} Mar 10 00:21:54 crc kubenswrapper[4906]: I0310 00:21:54.411026 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r7lrn" event={"ID":"fe4340c4-acc2-442e-9510-fa5818aa6b73","Type":"ContainerStarted","Data":"eec08a905982a87248af805bb658520a8d4b37b3f5dd37056c1fd5c92f9abfa0"} Mar 10 00:21:54 crc kubenswrapper[4906]: I0310 00:21:54.412610 4906 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 00:21:56 crc kubenswrapper[4906]: I0310 00:21:56.433223 4906 generic.go:334] "Generic (PLEG): container finished" podID="fe4340c4-acc2-442e-9510-fa5818aa6b73" containerID="7a59dc26a3e62bcc8de0df7f6234b4779f7e3c70fcb7eb7680968f930da1dce9" exitCode=0 Mar 10 00:21:56 crc kubenswrapper[4906]: I0310 00:21:56.433314 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r7lrn" event={"ID":"fe4340c4-acc2-442e-9510-fa5818aa6b73","Type":"ContainerDied","Data":"7a59dc26a3e62bcc8de0df7f6234b4779f7e3c70fcb7eb7680968f930da1dce9"} Mar 10 00:21:57 crc kubenswrapper[4906]: I0310 00:21:57.441519 4906 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-r7lrn" event={"ID":"fe4340c4-acc2-442e-9510-fa5818aa6b73","Type":"ContainerStarted","Data":"238d194f13bd40b7e2d399e00a8cef9fc485ec0d27cf6d51a4ddd8a3e99c09c3"} Mar 10 00:21:57 crc kubenswrapper[4906]: I0310 00:21:57.460877 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-r7lrn" podStartSLOduration=1.924534429 podStartE2EDuration="4.460854472s" podCreationTimestamp="2026-03-10 00:21:53 +0000 UTC" firstStartedPulling="2026-03-10 00:21:54.412376452 +0000 UTC m=+940.560271564" lastFinishedPulling="2026-03-10 00:21:56.948696495 +0000 UTC m=+943.096591607" observedRunningTime="2026-03-10 00:21:57.456363095 +0000 UTC m=+943.604258247" watchObservedRunningTime="2026-03-10 00:21:57.460854472 +0000 UTC m=+943.608749614" Mar 10 00:21:58 crc kubenswrapper[4906]: I0310 00:21:58.450919 4906 generic.go:334] "Generic (PLEG): container finished" podID="d2b34ffe-7101-460a-b2ed-b7c3a3e9725b" containerID="d569f871b22a55b60f2586f719ad1b3e646960872ab0b96d9ddec341fe9cc697" exitCode=0 Mar 10 00:21:58 crc kubenswrapper[4906]: I0310 00:21:58.451003 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b","Type":"ContainerDied","Data":"d569f871b22a55b60f2586f719ad1b3e646960872ab0b96d9ddec341fe9cc697"} Mar 10 00:21:59 crc kubenswrapper[4906]: I0310 00:21:59.744896 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:21:59 crc kubenswrapper[4906]: I0310 00:21:59.875780 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-builder-dockercfg-7plhd-pull\") pod \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\" (UID: \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\") " Mar 10 00:21:59 crc kubenswrapper[4906]: I0310 00:21:59.876232 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-build-ca-bundles\") pod \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\" (UID: \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\") " Mar 10 00:21:59 crc kubenswrapper[4906]: I0310 00:21:59.876305 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-container-storage-root\") pod \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\" (UID: \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\") " Mar 10 00:21:59 crc kubenswrapper[4906]: I0310 00:21:59.876404 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwn77\" (UniqueName: \"kubernetes.io/projected/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-kube-api-access-zwn77\") pod \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\" (UID: \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\") " Mar 10 00:21:59 crc kubenswrapper[4906]: I0310 00:21:59.876474 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-node-pullsecrets\") pod \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\" (UID: \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\") " Mar 10 00:21:59 crc kubenswrapper[4906]: I0310 00:21:59.876539 4906 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-build-system-configs\") pod \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\" (UID: \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\") " Mar 10 00:21:59 crc kubenswrapper[4906]: I0310 00:21:59.876612 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-builder-dockercfg-7plhd-push\") pod \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\" (UID: \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\") " Mar 10 00:21:59 crc kubenswrapper[4906]: I0310 00:21:59.876696 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-buildworkdir\") pod \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\" (UID: \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\") " Mar 10 00:21:59 crc kubenswrapper[4906]: I0310 00:21:59.876761 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-container-storage-run\") pod \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\" (UID: \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\") " Mar 10 00:21:59 crc kubenswrapper[4906]: I0310 00:21:59.876819 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-buildcachedir\") pod \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\" (UID: \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\") " Mar 10 00:21:59 crc kubenswrapper[4906]: I0310 00:21:59.876905 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-build-blob-cache\") pod \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\" (UID: \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\") " Mar 10 00:21:59 crc kubenswrapper[4906]: I0310 00:21:59.876954 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-build-proxy-ca-bundles\") pod \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\" (UID: \"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b\") " Mar 10 00:21:59 crc kubenswrapper[4906]: I0310 00:21:59.877372 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "d2b34ffe-7101-460a-b2ed-b7c3a3e9725b" (UID: "d2b34ffe-7101-460a-b2ed-b7c3a3e9725b"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:21:59 crc kubenswrapper[4906]: I0310 00:21:59.877494 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "d2b34ffe-7101-460a-b2ed-b7c3a3e9725b" (UID: "d2b34ffe-7101-460a-b2ed-b7c3a3e9725b"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:21:59 crc kubenswrapper[4906]: I0310 00:21:59.877483 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "d2b34ffe-7101-460a-b2ed-b7c3a3e9725b" (UID: "d2b34ffe-7101-460a-b2ed-b7c3a3e9725b"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:21:59 crc kubenswrapper[4906]: I0310 00:21:59.878385 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "d2b34ffe-7101-460a-b2ed-b7c3a3e9725b" (UID: "d2b34ffe-7101-460a-b2ed-b7c3a3e9725b"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:21:59 crc kubenswrapper[4906]: I0310 00:21:59.879162 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "d2b34ffe-7101-460a-b2ed-b7c3a3e9725b" (UID: "d2b34ffe-7101-460a-b2ed-b7c3a3e9725b"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:21:59 crc kubenswrapper[4906]: I0310 00:21:59.884914 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-kube-api-access-zwn77" (OuterVolumeSpecName: "kube-api-access-zwn77") pod "d2b34ffe-7101-460a-b2ed-b7c3a3e9725b" (UID: "d2b34ffe-7101-460a-b2ed-b7c3a3e9725b"). InnerVolumeSpecName "kube-api-access-zwn77". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:21:59 crc kubenswrapper[4906]: I0310 00:21:59.884951 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-builder-dockercfg-7plhd-pull" (OuterVolumeSpecName: "builder-dockercfg-7plhd-pull") pod "d2b34ffe-7101-460a-b2ed-b7c3a3e9725b" (UID: "d2b34ffe-7101-460a-b2ed-b7c3a3e9725b"). InnerVolumeSpecName "builder-dockercfg-7plhd-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:21:59 crc kubenswrapper[4906]: I0310 00:21:59.886770 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-builder-dockercfg-7plhd-push" (OuterVolumeSpecName: "builder-dockercfg-7plhd-push") pod "d2b34ffe-7101-460a-b2ed-b7c3a3e9725b" (UID: "d2b34ffe-7101-460a-b2ed-b7c3a3e9725b"). InnerVolumeSpecName "builder-dockercfg-7plhd-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:21:59 crc kubenswrapper[4906]: I0310 00:21:59.892285 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "d2b34ffe-7101-460a-b2ed-b7c3a3e9725b" (UID: "d2b34ffe-7101-460a-b2ed-b7c3a3e9725b"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:21:59 crc kubenswrapper[4906]: I0310 00:21:59.930181 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "d2b34ffe-7101-460a-b2ed-b7c3a3e9725b" (UID: "d2b34ffe-7101-460a-b2ed-b7c3a3e9725b"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:21:59 crc kubenswrapper[4906]: I0310 00:21:59.978214 4906 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 10 00:21:59 crc kubenswrapper[4906]: I0310 00:21:59.978253 4906 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 10 00:21:59 crc kubenswrapper[4906]: I0310 00:21:59.978263 4906 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 10 00:21:59 crc kubenswrapper[4906]: I0310 00:21:59.978272 4906 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:21:59 crc kubenswrapper[4906]: I0310 00:21:59.978280 4906 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-builder-dockercfg-7plhd-pull\") on node \"crc\" DevicePath \"\"" Mar 10 00:21:59 crc kubenswrapper[4906]: I0310 00:21:59.978289 4906 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:21:59 crc kubenswrapper[4906]: I0310 00:21:59.978297 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwn77\" (UniqueName: \"kubernetes.io/projected/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-kube-api-access-zwn77\") on node \"crc\" DevicePath \"\"" 
Mar 10 00:21:59 crc kubenswrapper[4906]: I0310 00:21:59.978307 4906 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 10 00:21:59 crc kubenswrapper[4906]: I0310 00:21:59.978315 4906 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 10 00:21:59 crc kubenswrapper[4906]: I0310 00:21:59.978324 4906 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-builder-dockercfg-7plhd-push\") on node \"crc\" DevicePath \"\"" Mar 10 00:22:00 crc kubenswrapper[4906]: I0310 00:22:00.074158 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "d2b34ffe-7101-460a-b2ed-b7c3a3e9725b" (UID: "d2b34ffe-7101-460a-b2ed-b7c3a3e9725b"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:22:00 crc kubenswrapper[4906]: I0310 00:22:00.079409 4906 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 10 00:22:00 crc kubenswrapper[4906]: I0310 00:22:00.140198 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551702-jk7xb"] Mar 10 00:22:00 crc kubenswrapper[4906]: E0310 00:22:00.140476 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2b34ffe-7101-460a-b2ed-b7c3a3e9725b" containerName="manage-dockerfile" Mar 10 00:22:00 crc kubenswrapper[4906]: I0310 00:22:00.140488 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2b34ffe-7101-460a-b2ed-b7c3a3e9725b" containerName="manage-dockerfile" Mar 10 00:22:00 crc kubenswrapper[4906]: E0310 00:22:00.140502 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2b34ffe-7101-460a-b2ed-b7c3a3e9725b" containerName="git-clone" Mar 10 00:22:00 crc kubenswrapper[4906]: I0310 00:22:00.140508 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2b34ffe-7101-460a-b2ed-b7c3a3e9725b" containerName="git-clone" Mar 10 00:22:00 crc kubenswrapper[4906]: E0310 00:22:00.140525 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2b34ffe-7101-460a-b2ed-b7c3a3e9725b" containerName="docker-build" Mar 10 00:22:00 crc kubenswrapper[4906]: I0310 00:22:00.140533 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2b34ffe-7101-460a-b2ed-b7c3a3e9725b" containerName="docker-build" Mar 10 00:22:00 crc kubenswrapper[4906]: I0310 00:22:00.140649 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2b34ffe-7101-460a-b2ed-b7c3a3e9725b" containerName="docker-build" Mar 10 00:22:00 crc kubenswrapper[4906]: I0310 00:22:00.141063 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551702-jk7xb" Mar 10 00:22:00 crc kubenswrapper[4906]: I0310 00:22:00.169441 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 00:22:00 crc kubenswrapper[4906]: I0310 00:22:00.170904 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 00:22:00 crc kubenswrapper[4906]: I0310 00:22:00.171467 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ktdr6" Mar 10 00:22:00 crc kubenswrapper[4906]: I0310 00:22:00.177324 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551702-jk7xb"] Mar 10 00:22:00 crc kubenswrapper[4906]: I0310 00:22:00.281599 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ltlm\" (UniqueName: \"kubernetes.io/projected/6c19fcab-7928-4edf-a882-003c32d33473-kube-api-access-8ltlm\") pod \"auto-csr-approver-29551702-jk7xb\" (UID: \"6c19fcab-7928-4edf-a882-003c32d33473\") " pod="openshift-infra/auto-csr-approver-29551702-jk7xb" Mar 10 00:22:00 crc kubenswrapper[4906]: I0310 00:22:00.383785 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ltlm\" (UniqueName: \"kubernetes.io/projected/6c19fcab-7928-4edf-a882-003c32d33473-kube-api-access-8ltlm\") pod \"auto-csr-approver-29551702-jk7xb\" (UID: \"6c19fcab-7928-4edf-a882-003c32d33473\") " pod="openshift-infra/auto-csr-approver-29551702-jk7xb" Mar 10 00:22:00 crc kubenswrapper[4906]: I0310 00:22:00.408245 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ltlm\" (UniqueName: \"kubernetes.io/projected/6c19fcab-7928-4edf-a882-003c32d33473-kube-api-access-8ltlm\") pod \"auto-csr-approver-29551702-jk7xb\" (UID: \"6c19fcab-7928-4edf-a882-003c32d33473\") " 
pod="openshift-infra/auto-csr-approver-29551702-jk7xb" Mar 10 00:22:00 crc kubenswrapper[4906]: I0310 00:22:00.467026 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"d2b34ffe-7101-460a-b2ed-b7c3a3e9725b","Type":"ContainerDied","Data":"289f8f06e9c49e92b3257348305f4974dcbc81201de064287ffa9bed598133de"} Mar 10 00:22:00 crc kubenswrapper[4906]: I0310 00:22:00.467085 4906 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="289f8f06e9c49e92b3257348305f4974dcbc81201de064287ffa9bed598133de" Mar 10 00:22:00 crc kubenswrapper[4906]: I0310 00:22:00.467210 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Mar 10 00:22:00 crc kubenswrapper[4906]: I0310 00:22:00.490280 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551702-jk7xb" Mar 10 00:22:00 crc kubenswrapper[4906]: I0310 00:22:00.502417 4906 patch_prober.go:28] interesting pod/machine-config-daemon-bxtw4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:22:00 crc kubenswrapper[4906]: I0310 00:22:00.502488 4906 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:22:00 crc kubenswrapper[4906]: I0310 00:22:00.741528 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551702-jk7xb"] Mar 10 00:22:01 crc kubenswrapper[4906]: I0310 00:22:01.482421 4906 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551702-jk7xb" event={"ID":"6c19fcab-7928-4edf-a882-003c32d33473","Type":"ContainerStarted","Data":"8fd140e61636e4f4225cde07409397e278ee58b7bfccd06a2504950d1b541366"} Mar 10 00:22:02 crc kubenswrapper[4906]: I0310 00:22:02.647731 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "d2b34ffe-7101-460a-b2ed-b7c3a3e9725b" (UID: "d2b34ffe-7101-460a-b2ed-b7c3a3e9725b"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:22:02 crc kubenswrapper[4906]: I0310 00:22:02.720830 4906 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d2b34ffe-7101-460a-b2ed-b7c3a3e9725b-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 10 00:22:03 crc kubenswrapper[4906]: I0310 00:22:03.499547 4906 generic.go:334] "Generic (PLEG): container finished" podID="6c19fcab-7928-4edf-a882-003c32d33473" containerID="3c3988e44e458523814e6fc3937ca0a4d8f2d2cf37a310c21836f4810d53f5c1" exitCode=0 Mar 10 00:22:03 crc kubenswrapper[4906]: I0310 00:22:03.499632 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551702-jk7xb" event={"ID":"6c19fcab-7928-4edf-a882-003c32d33473","Type":"ContainerDied","Data":"3c3988e44e458523814e6fc3937ca0a4d8f2d2cf37a310c21836f4810d53f5c1"} Mar 10 00:22:03 crc kubenswrapper[4906]: I0310 00:22:03.539281 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-r7lrn" Mar 10 00:22:03 crc kubenswrapper[4906]: I0310 00:22:03.539412 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-r7lrn" Mar 10 00:22:03 crc 
kubenswrapper[4906]: I0310 00:22:03.604580 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-r7lrn" Mar 10 00:22:04 crc kubenswrapper[4906]: I0310 00:22:04.587827 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-r7lrn" Mar 10 00:22:04 crc kubenswrapper[4906]: I0310 00:22:04.663407 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r7lrn"] Mar 10 00:22:04 crc kubenswrapper[4906]: I0310 00:22:04.786856 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551702-jk7xb" Mar 10 00:22:04 crc kubenswrapper[4906]: I0310 00:22:04.855089 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ltlm\" (UniqueName: \"kubernetes.io/projected/6c19fcab-7928-4edf-a882-003c32d33473-kube-api-access-8ltlm\") pod \"6c19fcab-7928-4edf-a882-003c32d33473\" (UID: \"6c19fcab-7928-4edf-a882-003c32d33473\") " Mar 10 00:22:04 crc kubenswrapper[4906]: I0310 00:22:04.860937 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c19fcab-7928-4edf-a882-003c32d33473-kube-api-access-8ltlm" (OuterVolumeSpecName: "kube-api-access-8ltlm") pod "6c19fcab-7928-4edf-a882-003c32d33473" (UID: "6c19fcab-7928-4edf-a882-003c32d33473"). InnerVolumeSpecName "kube-api-access-8ltlm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:22:04 crc kubenswrapper[4906]: I0310 00:22:04.956520 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ltlm\" (UniqueName: \"kubernetes.io/projected/6c19fcab-7928-4edf-a882-003c32d33473-kube-api-access-8ltlm\") on node \"crc\" DevicePath \"\"" Mar 10 00:22:05 crc kubenswrapper[4906]: I0310 00:22:05.108841 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 10 00:22:05 crc kubenswrapper[4906]: E0310 00:22:05.109067 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c19fcab-7928-4edf-a882-003c32d33473" containerName="oc" Mar 10 00:22:05 crc kubenswrapper[4906]: I0310 00:22:05.109078 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c19fcab-7928-4edf-a882-003c32d33473" containerName="oc" Mar 10 00:22:05 crc kubenswrapper[4906]: I0310 00:22:05.109193 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c19fcab-7928-4edf-a882-003c32d33473" containerName="oc" Mar 10 00:22:05 crc kubenswrapper[4906]: I0310 00:22:05.109792 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:05 crc kubenswrapper[4906]: I0310 00:22:05.115257 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-1-ca" Mar 10 00:22:05 crc kubenswrapper[4906]: I0310 00:22:05.115629 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-7plhd" Mar 10 00:22:05 crc kubenswrapper[4906]: I0310 00:22:05.115766 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-1-sys-config" Mar 10 00:22:05 crc kubenswrapper[4906]: I0310 00:22:05.115876 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-1-global-ca" Mar 10 00:22:05 crc kubenswrapper[4906]: I0310 00:22:05.134205 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 10 00:22:05 crc kubenswrapper[4906]: I0310 00:22:05.160672 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e0c307d3-7152-4589-9094-11f86de2552a-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"e0c307d3-7152-4589-9094-11f86de2552a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:05 crc kubenswrapper[4906]: I0310 00:22:05.160719 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e0c307d3-7152-4589-9094-11f86de2552a-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"e0c307d3-7152-4589-9094-11f86de2552a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:05 crc kubenswrapper[4906]: I0310 00:22:05.160744 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e0c307d3-7152-4589-9094-11f86de2552a-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"e0c307d3-7152-4589-9094-11f86de2552a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:05 crc kubenswrapper[4906]: I0310 00:22:05.160778 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e0c307d3-7152-4589-9094-11f86de2552a-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"e0c307d3-7152-4589-9094-11f86de2552a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:05 crc kubenswrapper[4906]: I0310 00:22:05.160950 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e0c307d3-7152-4589-9094-11f86de2552a-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"e0c307d3-7152-4589-9094-11f86de2552a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:05 crc kubenswrapper[4906]: I0310 00:22:05.160991 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k227b\" (UniqueName: \"kubernetes.io/projected/e0c307d3-7152-4589-9094-11f86de2552a-kube-api-access-k227b\") pod \"smart-gateway-operator-1-build\" (UID: \"e0c307d3-7152-4589-9094-11f86de2552a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:05 crc kubenswrapper[4906]: I0310 00:22:05.161017 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e0c307d3-7152-4589-9094-11f86de2552a-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"e0c307d3-7152-4589-9094-11f86de2552a\") " 
pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:05 crc kubenswrapper[4906]: I0310 00:22:05.161049 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e0c307d3-7152-4589-9094-11f86de2552a-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"e0c307d3-7152-4589-9094-11f86de2552a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:05 crc kubenswrapper[4906]: I0310 00:22:05.161082 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/e0c307d3-7152-4589-9094-11f86de2552a-builder-dockercfg-7plhd-push\") pod \"smart-gateway-operator-1-build\" (UID: \"e0c307d3-7152-4589-9094-11f86de2552a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:05 crc kubenswrapper[4906]: I0310 00:22:05.161200 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e0c307d3-7152-4589-9094-11f86de2552a-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"e0c307d3-7152-4589-9094-11f86de2552a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:05 crc kubenswrapper[4906]: I0310 00:22:05.161236 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e0c307d3-7152-4589-9094-11f86de2552a-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"e0c307d3-7152-4589-9094-11f86de2552a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:05 crc kubenswrapper[4906]: I0310 00:22:05.161265 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: 
\"kubernetes.io/secret/e0c307d3-7152-4589-9094-11f86de2552a-builder-dockercfg-7plhd-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"e0c307d3-7152-4589-9094-11f86de2552a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:05 crc kubenswrapper[4906]: I0310 00:22:05.263703 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e0c307d3-7152-4589-9094-11f86de2552a-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"e0c307d3-7152-4589-9094-11f86de2552a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:05 crc kubenswrapper[4906]: I0310 00:22:05.263770 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k227b\" (UniqueName: \"kubernetes.io/projected/e0c307d3-7152-4589-9094-11f86de2552a-kube-api-access-k227b\") pod \"smart-gateway-operator-1-build\" (UID: \"e0c307d3-7152-4589-9094-11f86de2552a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:05 crc kubenswrapper[4906]: I0310 00:22:05.263811 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e0c307d3-7152-4589-9094-11f86de2552a-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"e0c307d3-7152-4589-9094-11f86de2552a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:05 crc kubenswrapper[4906]: I0310 00:22:05.263851 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e0c307d3-7152-4589-9094-11f86de2552a-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"e0c307d3-7152-4589-9094-11f86de2552a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:05 crc kubenswrapper[4906]: I0310 00:22:05.263894 4906 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/e0c307d3-7152-4589-9094-11f86de2552a-builder-dockercfg-7plhd-push\") pod \"smart-gateway-operator-1-build\" (UID: \"e0c307d3-7152-4589-9094-11f86de2552a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:05 crc kubenswrapper[4906]: I0310 00:22:05.263997 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e0c307d3-7152-4589-9094-11f86de2552a-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"e0c307d3-7152-4589-9094-11f86de2552a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:05 crc kubenswrapper[4906]: I0310 00:22:05.264021 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e0c307d3-7152-4589-9094-11f86de2552a-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"e0c307d3-7152-4589-9094-11f86de2552a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:05 crc kubenswrapper[4906]: I0310 00:22:05.264067 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e0c307d3-7152-4589-9094-11f86de2552a-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"e0c307d3-7152-4589-9094-11f86de2552a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:05 crc kubenswrapper[4906]: I0310 00:22:05.264187 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/e0c307d3-7152-4589-9094-11f86de2552a-builder-dockercfg-7plhd-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"e0c307d3-7152-4589-9094-11f86de2552a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:05 crc kubenswrapper[4906]: I0310 00:22:05.264293 4906 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e0c307d3-7152-4589-9094-11f86de2552a-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"e0c307d3-7152-4589-9094-11f86de2552a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:05 crc kubenswrapper[4906]: I0310 00:22:05.264338 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e0c307d3-7152-4589-9094-11f86de2552a-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"e0c307d3-7152-4589-9094-11f86de2552a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:05 crc kubenswrapper[4906]: I0310 00:22:05.264765 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e0c307d3-7152-4589-9094-11f86de2552a-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"e0c307d3-7152-4589-9094-11f86de2552a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:05 crc kubenswrapper[4906]: I0310 00:22:05.264623 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e0c307d3-7152-4589-9094-11f86de2552a-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"e0c307d3-7152-4589-9094-11f86de2552a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:05 crc kubenswrapper[4906]: I0310 00:22:05.264829 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e0c307d3-7152-4589-9094-11f86de2552a-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"e0c307d3-7152-4589-9094-11f86de2552a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:05 crc 
kubenswrapper[4906]: I0310 00:22:05.264494 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e0c307d3-7152-4589-9094-11f86de2552a-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"e0c307d3-7152-4589-9094-11f86de2552a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:05 crc kubenswrapper[4906]: I0310 00:22:05.264857 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e0c307d3-7152-4589-9094-11f86de2552a-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"e0c307d3-7152-4589-9094-11f86de2552a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:05 crc kubenswrapper[4906]: I0310 00:22:05.264959 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e0c307d3-7152-4589-9094-11f86de2552a-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"e0c307d3-7152-4589-9094-11f86de2552a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:05 crc kubenswrapper[4906]: I0310 00:22:05.265218 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e0c307d3-7152-4589-9094-11f86de2552a-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"e0c307d3-7152-4589-9094-11f86de2552a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:05 crc kubenswrapper[4906]: I0310 00:22:05.265236 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e0c307d3-7152-4589-9094-11f86de2552a-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"e0c307d3-7152-4589-9094-11f86de2552a\") " pod="service-telemetry/smart-gateway-operator-1-build" 
Mar 10 00:22:05 crc kubenswrapper[4906]: I0310 00:22:05.265969 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e0c307d3-7152-4589-9094-11f86de2552a-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"e0c307d3-7152-4589-9094-11f86de2552a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:05 crc kubenswrapper[4906]: I0310 00:22:05.266951 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e0c307d3-7152-4589-9094-11f86de2552a-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"e0c307d3-7152-4589-9094-11f86de2552a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:05 crc kubenswrapper[4906]: I0310 00:22:05.269943 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/e0c307d3-7152-4589-9094-11f86de2552a-builder-dockercfg-7plhd-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"e0c307d3-7152-4589-9094-11f86de2552a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:05 crc kubenswrapper[4906]: I0310 00:22:05.271586 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/e0c307d3-7152-4589-9094-11f86de2552a-builder-dockercfg-7plhd-push\") pod \"smart-gateway-operator-1-build\" (UID: \"e0c307d3-7152-4589-9094-11f86de2552a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:05 crc kubenswrapper[4906]: I0310 00:22:05.293119 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k227b\" (UniqueName: \"kubernetes.io/projected/e0c307d3-7152-4589-9094-11f86de2552a-kube-api-access-k227b\") pod \"smart-gateway-operator-1-build\" (UID: \"e0c307d3-7152-4589-9094-11f86de2552a\") 
" pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:05 crc kubenswrapper[4906]: I0310 00:22:05.429798 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:05 crc kubenswrapper[4906]: I0310 00:22:05.520445 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551702-jk7xb" Mar 10 00:22:05 crc kubenswrapper[4906]: I0310 00:22:05.523966 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551702-jk7xb" event={"ID":"6c19fcab-7928-4edf-a882-003c32d33473","Type":"ContainerDied","Data":"8fd140e61636e4f4225cde07409397e278ee58b7bfccd06a2504950d1b541366"} Mar 10 00:22:05 crc kubenswrapper[4906]: I0310 00:22:05.524029 4906 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8fd140e61636e4f4225cde07409397e278ee58b7bfccd06a2504950d1b541366" Mar 10 00:22:05 crc kubenswrapper[4906]: I0310 00:22:05.848619 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551696-9x59s"] Mar 10 00:22:05 crc kubenswrapper[4906]: I0310 00:22:05.852257 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551696-9x59s"] Mar 10 00:22:05 crc kubenswrapper[4906]: I0310 00:22:05.895895 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 10 00:22:06 crc kubenswrapper[4906]: I0310 00:22:06.533289 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"e0c307d3-7152-4589-9094-11f86de2552a","Type":"ContainerStarted","Data":"520babbfeede5840ee24b4c10a0c7ec8ee10995144059b9d820c9db24656a2af"} Mar 10 00:22:06 crc kubenswrapper[4906]: I0310 00:22:06.533348 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"e0c307d3-7152-4589-9094-11f86de2552a","Type":"ContainerStarted","Data":"8159f71062ac12a5160cb444563ca0d59276a75e014c4b8e1475c232d6ff6d88"} Mar 10 00:22:06 crc kubenswrapper[4906]: I0310 00:22:06.533415 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-r7lrn" podUID="fe4340c4-acc2-442e-9510-fa5818aa6b73" containerName="registry-server" containerID="cri-o://238d194f13bd40b7e2d399e00a8cef9fc485ec0d27cf6d51a4ddd8a3e99c09c3" gracePeriod=2 Mar 10 00:22:06 crc kubenswrapper[4906]: I0310 00:22:06.592760 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20232528-5424-4988-b5c5-52011267a7e4" path="/var/lib/kubelet/pods/20232528-5424-4988-b5c5-52011267a7e4/volumes" Mar 10 00:22:07 crc kubenswrapper[4906]: I0310 00:22:07.038793 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r7lrn" Mar 10 00:22:07 crc kubenswrapper[4906]: I0310 00:22:07.089204 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gh2z5\" (UniqueName: \"kubernetes.io/projected/fe4340c4-acc2-442e-9510-fa5818aa6b73-kube-api-access-gh2z5\") pod \"fe4340c4-acc2-442e-9510-fa5818aa6b73\" (UID: \"fe4340c4-acc2-442e-9510-fa5818aa6b73\") " Mar 10 00:22:07 crc kubenswrapper[4906]: I0310 00:22:07.089372 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe4340c4-acc2-442e-9510-fa5818aa6b73-catalog-content\") pod \"fe4340c4-acc2-442e-9510-fa5818aa6b73\" (UID: \"fe4340c4-acc2-442e-9510-fa5818aa6b73\") " Mar 10 00:22:07 crc kubenswrapper[4906]: I0310 00:22:07.089424 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe4340c4-acc2-442e-9510-fa5818aa6b73-utilities\") 
pod \"fe4340c4-acc2-442e-9510-fa5818aa6b73\" (UID: \"fe4340c4-acc2-442e-9510-fa5818aa6b73\") " Mar 10 00:22:07 crc kubenswrapper[4906]: I0310 00:22:07.091118 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe4340c4-acc2-442e-9510-fa5818aa6b73-utilities" (OuterVolumeSpecName: "utilities") pod "fe4340c4-acc2-442e-9510-fa5818aa6b73" (UID: "fe4340c4-acc2-442e-9510-fa5818aa6b73"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:22:07 crc kubenswrapper[4906]: I0310 00:22:07.098838 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe4340c4-acc2-442e-9510-fa5818aa6b73-kube-api-access-gh2z5" (OuterVolumeSpecName: "kube-api-access-gh2z5") pod "fe4340c4-acc2-442e-9510-fa5818aa6b73" (UID: "fe4340c4-acc2-442e-9510-fa5818aa6b73"). InnerVolumeSpecName "kube-api-access-gh2z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:22:07 crc kubenswrapper[4906]: I0310 00:22:07.160789 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe4340c4-acc2-442e-9510-fa5818aa6b73-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fe4340c4-acc2-442e-9510-fa5818aa6b73" (UID: "fe4340c4-acc2-442e-9510-fa5818aa6b73"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:22:07 crc kubenswrapper[4906]: I0310 00:22:07.191759 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gh2z5\" (UniqueName: \"kubernetes.io/projected/fe4340c4-acc2-442e-9510-fa5818aa6b73-kube-api-access-gh2z5\") on node \"crc\" DevicePath \"\"" Mar 10 00:22:07 crc kubenswrapper[4906]: I0310 00:22:07.191812 4906 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe4340c4-acc2-442e-9510-fa5818aa6b73-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 00:22:07 crc kubenswrapper[4906]: I0310 00:22:07.191836 4906 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe4340c4-acc2-442e-9510-fa5818aa6b73-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 00:22:07 crc kubenswrapper[4906]: I0310 00:22:07.545672 4906 generic.go:334] "Generic (PLEG): container finished" podID="fe4340c4-acc2-442e-9510-fa5818aa6b73" containerID="238d194f13bd40b7e2d399e00a8cef9fc485ec0d27cf6d51a4ddd8a3e99c09c3" exitCode=0 Mar 10 00:22:07 crc kubenswrapper[4906]: I0310 00:22:07.545749 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r7lrn" event={"ID":"fe4340c4-acc2-442e-9510-fa5818aa6b73","Type":"ContainerDied","Data":"238d194f13bd40b7e2d399e00a8cef9fc485ec0d27cf6d51a4ddd8a3e99c09c3"} Mar 10 00:22:07 crc kubenswrapper[4906]: I0310 00:22:07.545782 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r7lrn" event={"ID":"fe4340c4-acc2-442e-9510-fa5818aa6b73","Type":"ContainerDied","Data":"eec08a905982a87248af805bb658520a8d4b37b3f5dd37056c1fd5c92f9abfa0"} Mar 10 00:22:07 crc kubenswrapper[4906]: I0310 00:22:07.545785 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r7lrn" Mar 10 00:22:07 crc kubenswrapper[4906]: I0310 00:22:07.545818 4906 scope.go:117] "RemoveContainer" containerID="238d194f13bd40b7e2d399e00a8cef9fc485ec0d27cf6d51a4ddd8a3e99c09c3" Mar 10 00:22:07 crc kubenswrapper[4906]: I0310 00:22:07.550830 4906 generic.go:334] "Generic (PLEG): container finished" podID="e0c307d3-7152-4589-9094-11f86de2552a" containerID="520babbfeede5840ee24b4c10a0c7ec8ee10995144059b9d820c9db24656a2af" exitCode=0 Mar 10 00:22:07 crc kubenswrapper[4906]: I0310 00:22:07.550900 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"e0c307d3-7152-4589-9094-11f86de2552a","Type":"ContainerDied","Data":"520babbfeede5840ee24b4c10a0c7ec8ee10995144059b9d820c9db24656a2af"} Mar 10 00:22:07 crc kubenswrapper[4906]: I0310 00:22:07.598823 4906 scope.go:117] "RemoveContainer" containerID="7a59dc26a3e62bcc8de0df7f6234b4779f7e3c70fcb7eb7680968f930da1dce9" Mar 10 00:22:07 crc kubenswrapper[4906]: I0310 00:22:07.642957 4906 scope.go:117] "RemoveContainer" containerID="8d307f3a9f5d6b4a7812dbaf6ae05b3462a68fd9909c726725a82b025c63f8d6" Mar 10 00:22:07 crc kubenswrapper[4906]: I0310 00:22:07.645767 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r7lrn"] Mar 10 00:22:07 crc kubenswrapper[4906]: I0310 00:22:07.651265 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-r7lrn"] Mar 10 00:22:07 crc kubenswrapper[4906]: I0310 00:22:07.667218 4906 scope.go:117] "RemoveContainer" containerID="238d194f13bd40b7e2d399e00a8cef9fc485ec0d27cf6d51a4ddd8a3e99c09c3" Mar 10 00:22:07 crc kubenswrapper[4906]: E0310 00:22:07.667713 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"238d194f13bd40b7e2d399e00a8cef9fc485ec0d27cf6d51a4ddd8a3e99c09c3\": container with ID 
starting with 238d194f13bd40b7e2d399e00a8cef9fc485ec0d27cf6d51a4ddd8a3e99c09c3 not found: ID does not exist" containerID="238d194f13bd40b7e2d399e00a8cef9fc485ec0d27cf6d51a4ddd8a3e99c09c3" Mar 10 00:22:07 crc kubenswrapper[4906]: I0310 00:22:07.667835 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"238d194f13bd40b7e2d399e00a8cef9fc485ec0d27cf6d51a4ddd8a3e99c09c3"} err="failed to get container status \"238d194f13bd40b7e2d399e00a8cef9fc485ec0d27cf6d51a4ddd8a3e99c09c3\": rpc error: code = NotFound desc = could not find container \"238d194f13bd40b7e2d399e00a8cef9fc485ec0d27cf6d51a4ddd8a3e99c09c3\": container with ID starting with 238d194f13bd40b7e2d399e00a8cef9fc485ec0d27cf6d51a4ddd8a3e99c09c3 not found: ID does not exist" Mar 10 00:22:07 crc kubenswrapper[4906]: I0310 00:22:07.667951 4906 scope.go:117] "RemoveContainer" containerID="7a59dc26a3e62bcc8de0df7f6234b4779f7e3c70fcb7eb7680968f930da1dce9" Mar 10 00:22:07 crc kubenswrapper[4906]: E0310 00:22:07.668446 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a59dc26a3e62bcc8de0df7f6234b4779f7e3c70fcb7eb7680968f930da1dce9\": container with ID starting with 7a59dc26a3e62bcc8de0df7f6234b4779f7e3c70fcb7eb7680968f930da1dce9 not found: ID does not exist" containerID="7a59dc26a3e62bcc8de0df7f6234b4779f7e3c70fcb7eb7680968f930da1dce9" Mar 10 00:22:07 crc kubenswrapper[4906]: I0310 00:22:07.668536 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a59dc26a3e62bcc8de0df7f6234b4779f7e3c70fcb7eb7680968f930da1dce9"} err="failed to get container status \"7a59dc26a3e62bcc8de0df7f6234b4779f7e3c70fcb7eb7680968f930da1dce9\": rpc error: code = NotFound desc = could not find container \"7a59dc26a3e62bcc8de0df7f6234b4779f7e3c70fcb7eb7680968f930da1dce9\": container with ID starting with 7a59dc26a3e62bcc8de0df7f6234b4779f7e3c70fcb7eb7680968f930da1dce9 not found: 
ID does not exist" Mar 10 00:22:07 crc kubenswrapper[4906]: I0310 00:22:07.668583 4906 scope.go:117] "RemoveContainer" containerID="8d307f3a9f5d6b4a7812dbaf6ae05b3462a68fd9909c726725a82b025c63f8d6" Mar 10 00:22:07 crc kubenswrapper[4906]: E0310 00:22:07.669215 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d307f3a9f5d6b4a7812dbaf6ae05b3462a68fd9909c726725a82b025c63f8d6\": container with ID starting with 8d307f3a9f5d6b4a7812dbaf6ae05b3462a68fd9909c726725a82b025c63f8d6 not found: ID does not exist" containerID="8d307f3a9f5d6b4a7812dbaf6ae05b3462a68fd9909c726725a82b025c63f8d6" Mar 10 00:22:07 crc kubenswrapper[4906]: I0310 00:22:07.669263 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d307f3a9f5d6b4a7812dbaf6ae05b3462a68fd9909c726725a82b025c63f8d6"} err="failed to get container status \"8d307f3a9f5d6b4a7812dbaf6ae05b3462a68fd9909c726725a82b025c63f8d6\": rpc error: code = NotFound desc = could not find container \"8d307f3a9f5d6b4a7812dbaf6ae05b3462a68fd9909c726725a82b025c63f8d6\": container with ID starting with 8d307f3a9f5d6b4a7812dbaf6ae05b3462a68fd9909c726725a82b025c63f8d6 not found: ID does not exist" Mar 10 00:22:08 crc kubenswrapper[4906]: I0310 00:22:08.565616 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"e0c307d3-7152-4589-9094-11f86de2552a","Type":"ContainerStarted","Data":"b16118ec7fe4f1a0ea6dac2489452940812dc188694c87a0ccf244c264eec4da"} Mar 10 00:22:08 crc kubenswrapper[4906]: I0310 00:22:08.592270 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe4340c4-acc2-442e-9510-fa5818aa6b73" path="/var/lib/kubelet/pods/fe4340c4-acc2-442e-9510-fa5818aa6b73/volumes" Mar 10 00:22:08 crc kubenswrapper[4906]: I0310 00:22:08.619061 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="service-telemetry/smart-gateway-operator-1-build" podStartSLOduration=3.619027048 podStartE2EDuration="3.619027048s" podCreationTimestamp="2026-03-10 00:22:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:22:08.609498519 +0000 UTC m=+954.757393691" watchObservedRunningTime="2026-03-10 00:22:08.619027048 +0000 UTC m=+954.766922200" Mar 10 00:22:16 crc kubenswrapper[4906]: I0310 00:22:16.019490 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 10 00:22:16 crc kubenswrapper[4906]: I0310 00:22:16.021011 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/smart-gateway-operator-1-build" podUID="e0c307d3-7152-4589-9094-11f86de2552a" containerName="docker-build" containerID="cri-o://b16118ec7fe4f1a0ea6dac2489452940812dc188694c87a0ccf244c264eec4da" gracePeriod=30 Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.642269 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"] Mar 10 00:22:17 crc kubenswrapper[4906]: E0310 00:22:17.643311 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe4340c4-acc2-442e-9510-fa5818aa6b73" containerName="registry-server" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.643351 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe4340c4-acc2-442e-9510-fa5818aa6b73" containerName="registry-server" Mar 10 00:22:17 crc kubenswrapper[4906]: E0310 00:22:17.643368 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe4340c4-acc2-442e-9510-fa5818aa6b73" containerName="extract-content" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.643374 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe4340c4-acc2-442e-9510-fa5818aa6b73" containerName="extract-content" Mar 10 00:22:17 crc kubenswrapper[4906]: E0310 
00:22:17.643386 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe4340c4-acc2-442e-9510-fa5818aa6b73" containerName="extract-utilities" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.643392 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe4340c4-acc2-442e-9510-fa5818aa6b73" containerName="extract-utilities" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.643498 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe4340c4-acc2-442e-9510-fa5818aa6b73" containerName="registry-server" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.644505 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.647802 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-2-sys-config" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.647992 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-2-ca" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.648042 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-2-global-ca" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.651160 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-1-build_e0c307d3-7152-4589-9094-11f86de2552a/docker-build/0.log" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.652261 4906 generic.go:334] "Generic (PLEG): container finished" podID="e0c307d3-7152-4589-9094-11f86de2552a" containerID="b16118ec7fe4f1a0ea6dac2489452940812dc188694c87a0ccf244c264eec4da" exitCode=1 Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.652304 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" 
event={"ID":"e0c307d3-7152-4589-9094-11f86de2552a","Type":"ContainerDied","Data":"b16118ec7fe4f1a0ea6dac2489452940812dc188694c87a0ccf244c264eec4da"} Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.674687 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"] Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.703192 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.703233 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.703256 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.703392 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\") " 
pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.703475 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-builder-dockercfg-7plhd-push\") pod \"smart-gateway-operator-2-build\" (UID: \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.703530 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.703559 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.703589 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-builder-dockercfg-7plhd-pull\") pod \"smart-gateway-operator-2-build\" (UID: \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.703623 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8j6v\" 
(UniqueName: \"kubernetes.io/projected/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-kube-api-access-c8j6v\") pod \"smart-gateway-operator-2-build\" (UID: \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.703665 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.703705 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.703778 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.751787 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-1-build_e0c307d3-7152-4589-9094-11f86de2552a/docker-build/0.log" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.752201 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.806254 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/e0c307d3-7152-4589-9094-11f86de2552a-builder-dockercfg-7plhd-push\") pod \"e0c307d3-7152-4589-9094-11f86de2552a\" (UID: \"e0c307d3-7152-4589-9094-11f86de2552a\") " Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.806324 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k227b\" (UniqueName: \"kubernetes.io/projected/e0c307d3-7152-4589-9094-11f86de2552a-kube-api-access-k227b\") pod \"e0c307d3-7152-4589-9094-11f86de2552a\" (UID: \"e0c307d3-7152-4589-9094-11f86de2552a\") " Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.806385 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e0c307d3-7152-4589-9094-11f86de2552a-build-system-configs\") pod \"e0c307d3-7152-4589-9094-11f86de2552a\" (UID: \"e0c307d3-7152-4589-9094-11f86de2552a\") " Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.806409 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e0c307d3-7152-4589-9094-11f86de2552a-build-blob-cache\") pod \"e0c307d3-7152-4589-9094-11f86de2552a\" (UID: \"e0c307d3-7152-4589-9094-11f86de2552a\") " Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.806508 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e0c307d3-7152-4589-9094-11f86de2552a-buildworkdir\") pod \"e0c307d3-7152-4589-9094-11f86de2552a\" (UID: \"e0c307d3-7152-4589-9094-11f86de2552a\") " Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.806556 4906 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e0c307d3-7152-4589-9094-11f86de2552a-node-pullsecrets\") pod \"e0c307d3-7152-4589-9094-11f86de2552a\" (UID: \"e0c307d3-7152-4589-9094-11f86de2552a\") " Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.806580 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e0c307d3-7152-4589-9094-11f86de2552a-build-proxy-ca-bundles\") pod \"e0c307d3-7152-4589-9094-11f86de2552a\" (UID: \"e0c307d3-7152-4589-9094-11f86de2552a\") " Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.806602 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/e0c307d3-7152-4589-9094-11f86de2552a-builder-dockercfg-7plhd-pull\") pod \"e0c307d3-7152-4589-9094-11f86de2552a\" (UID: \"e0c307d3-7152-4589-9094-11f86de2552a\") " Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.806675 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e0c307d3-7152-4589-9094-11f86de2552a-buildcachedir\") pod \"e0c307d3-7152-4589-9094-11f86de2552a\" (UID: \"e0c307d3-7152-4589-9094-11f86de2552a\") " Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.806745 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e0c307d3-7152-4589-9094-11f86de2552a-container-storage-root\") pod \"e0c307d3-7152-4589-9094-11f86de2552a\" (UID: \"e0c307d3-7152-4589-9094-11f86de2552a\") " Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.806796 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e0c307d3-7152-4589-9094-11f86de2552a-build-ca-bundles\") 
pod \"e0c307d3-7152-4589-9094-11f86de2552a\" (UID: \"e0c307d3-7152-4589-9094-11f86de2552a\") " Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.806828 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e0c307d3-7152-4589-9094-11f86de2552a-container-storage-run\") pod \"e0c307d3-7152-4589-9094-11f86de2552a\" (UID: \"e0c307d3-7152-4589-9094-11f86de2552a\") " Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.807060 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-builder-dockercfg-7plhd-push\") pod \"smart-gateway-operator-2-build\" (UID: \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.807126 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.807157 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.807208 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-builder-dockercfg-7plhd-pull\") pod 
\"smart-gateway-operator-2-build\" (UID: \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.807235 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8j6v\" (UniqueName: \"kubernetes.io/projected/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-kube-api-access-c8j6v\") pod \"smart-gateway-operator-2-build\" (UID: \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.807276 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.807302 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.807361 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.807434 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: 
\"kubernetes.io/empty-dir/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.807463 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.807505 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.807544 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.807795 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.808809 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/e0c307d3-7152-4589-9094-11f86de2552a-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "e0c307d3-7152-4589-9094-11f86de2552a" (UID: "e0c307d3-7152-4589-9094-11f86de2552a"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.808887 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0c307d3-7152-4589-9094-11f86de2552a-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "e0c307d3-7152-4589-9094-11f86de2552a" (UID: "e0c307d3-7152-4589-9094-11f86de2552a"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.810456 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0c307d3-7152-4589-9094-11f86de2552a-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "e0c307d3-7152-4589-9094-11f86de2552a" (UID: "e0c307d3-7152-4589-9094-11f86de2552a"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.812302 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0c307d3-7152-4589-9094-11f86de2552a-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "e0c307d3-7152-4589-9094-11f86de2552a" (UID: "e0c307d3-7152-4589-9094-11f86de2552a"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.812840 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.813246 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.813409 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.813512 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.813573 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\") " 
pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.814942 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.815673 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0c307d3-7152-4589-9094-11f86de2552a-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "e0c307d3-7152-4589-9094-11f86de2552a" (UID: "e0c307d3-7152-4589-9094-11f86de2552a"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.817445 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.821480 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-builder-dockercfg-7plhd-push\") pod \"smart-gateway-operator-2-build\" (UID: \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.828832 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0c307d3-7152-4589-9094-11f86de2552a-kube-api-access-k227b" (OuterVolumeSpecName: 
"kube-api-access-k227b") pod "e0c307d3-7152-4589-9094-11f86de2552a" (UID: "e0c307d3-7152-4589-9094-11f86de2552a"). InnerVolumeSpecName "kube-api-access-k227b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.828938 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0c307d3-7152-4589-9094-11f86de2552a-builder-dockercfg-7plhd-pull" (OuterVolumeSpecName: "builder-dockercfg-7plhd-pull") pod "e0c307d3-7152-4589-9094-11f86de2552a" (UID: "e0c307d3-7152-4589-9094-11f86de2552a"). InnerVolumeSpecName "builder-dockercfg-7plhd-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.829390 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.829506 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0c307d3-7152-4589-9094-11f86de2552a-builder-dockercfg-7plhd-push" (OuterVolumeSpecName: "builder-dockercfg-7plhd-push") pod "e0c307d3-7152-4589-9094-11f86de2552a" (UID: "e0c307d3-7152-4589-9094-11f86de2552a"). InnerVolumeSpecName "builder-dockercfg-7plhd-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.833970 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0c307d3-7152-4589-9094-11f86de2552a-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "e0c307d3-7152-4589-9094-11f86de2552a" (UID: "e0c307d3-7152-4589-9094-11f86de2552a"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.835126 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0c307d3-7152-4589-9094-11f86de2552a-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "e0c307d3-7152-4589-9094-11f86de2552a" (UID: "e0c307d3-7152-4589-9094-11f86de2552a"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.840802 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-builder-dockercfg-7plhd-pull\") pod \"smart-gateway-operator-2-build\" (UID: \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.857292 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8j6v\" (UniqueName: \"kubernetes.io/projected/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-kube-api-access-c8j6v\") pod \"smart-gateway-operator-2-build\" (UID: \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.908742 4906 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e0c307d3-7152-4589-9094-11f86de2552a-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.909076 4906 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e0c307d3-7152-4589-9094-11f86de2552a-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.909086 4906 reconciler_common.go:293] "Volume detached for volume 
\"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e0c307d3-7152-4589-9094-11f86de2552a-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.909094 4906 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/e0c307d3-7152-4589-9094-11f86de2552a-builder-dockercfg-7plhd-pull\") on node \"crc\" DevicePath \"\"" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.909104 4906 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e0c307d3-7152-4589-9094-11f86de2552a-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.909112 4906 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e0c307d3-7152-4589-9094-11f86de2552a-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.909121 4906 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e0c307d3-7152-4589-9094-11f86de2552a-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.909132 4906 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/e0c307d3-7152-4589-9094-11f86de2552a-builder-dockercfg-7plhd-push\") on node \"crc\" DevicePath \"\"" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.909140 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k227b\" (UniqueName: \"kubernetes.io/projected/e0c307d3-7152-4589-9094-11f86de2552a-kube-api-access-k227b\") on node \"crc\" DevicePath \"\"" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.909149 4906 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: 
\"kubernetes.io/configmap/e0c307d3-7152-4589-9094-11f86de2552a-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.969927 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:22:17 crc kubenswrapper[4906]: I0310 00:22:17.991914 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0c307d3-7152-4589-9094-11f86de2552a-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "e0c307d3-7152-4589-9094-11f86de2552a" (UID: "e0c307d3-7152-4589-9094-11f86de2552a"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:22:18 crc kubenswrapper[4906]: I0310 00:22:18.010528 4906 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e0c307d3-7152-4589-9094-11f86de2552a-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 10 00:22:18 crc kubenswrapper[4906]: I0310 00:22:18.261302 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0c307d3-7152-4589-9094-11f86de2552a-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "e0c307d3-7152-4589-9094-11f86de2552a" (UID: "e0c307d3-7152-4589-9094-11f86de2552a"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:22:18 crc kubenswrapper[4906]: I0310 00:22:18.314877 4906 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e0c307d3-7152-4589-9094-11f86de2552a-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 10 00:22:18 crc kubenswrapper[4906]: I0310 00:22:18.401289 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"] Mar 10 00:22:18 crc kubenswrapper[4906]: I0310 00:22:18.660702 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-1-build_e0c307d3-7152-4589-9094-11f86de2552a/docker-build/0.log" Mar 10 00:22:18 crc kubenswrapper[4906]: I0310 00:22:18.661042 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"e0c307d3-7152-4589-9094-11f86de2552a","Type":"ContainerDied","Data":"8159f71062ac12a5160cb444563ca0d59276a75e014c4b8e1475c232d6ff6d88"} Mar 10 00:22:18 crc kubenswrapper[4906]: I0310 00:22:18.661078 4906 scope.go:117] "RemoveContainer" containerID="b16118ec7fe4f1a0ea6dac2489452940812dc188694c87a0ccf244c264eec4da" Mar 10 00:22:18 crc kubenswrapper[4906]: I0310 00:22:18.661176 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Mar 10 00:22:18 crc kubenswrapper[4906]: I0310 00:22:18.662804 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8","Type":"ContainerStarted","Data":"2b40d396bf6ffac2ed473b04f1e313db9ab1dc94bddd2f684f45e1e39e150c12"} Mar 10 00:22:18 crc kubenswrapper[4906]: I0310 00:22:18.684219 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 10 00:22:18 crc kubenswrapper[4906]: I0310 00:22:18.697494 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 10 00:22:18 crc kubenswrapper[4906]: I0310 00:22:18.747654 4906 scope.go:117] "RemoveContainer" containerID="520babbfeede5840ee24b4c10a0c7ec8ee10995144059b9d820c9db24656a2af" Mar 10 00:22:19 crc kubenswrapper[4906]: I0310 00:22:19.680963 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8","Type":"ContainerStarted","Data":"71ab76c8f17829bcd2dfe0d6dddba092730ed5e6308bb423bf96eb4e3b53b051"} Mar 10 00:22:20 crc kubenswrapper[4906]: I0310 00:22:20.599779 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0c307d3-7152-4589-9094-11f86de2552a" path="/var/lib/kubelet/pods/e0c307d3-7152-4589-9094-11f86de2552a/volumes" Mar 10 00:22:20 crc kubenswrapper[4906]: I0310 00:22:20.692074 4906 generic.go:334] "Generic (PLEG): container finished" podID="0dd2f361-2dd6-42e5-a582-1aaa6492e5c8" containerID="71ab76c8f17829bcd2dfe0d6dddba092730ed5e6308bb423bf96eb4e3b53b051" exitCode=0 Mar 10 00:22:20 crc kubenswrapper[4906]: I0310 00:22:20.692144 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" 
event={"ID":"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8","Type":"ContainerDied","Data":"71ab76c8f17829bcd2dfe0d6dddba092730ed5e6308bb423bf96eb4e3b53b051"} Mar 10 00:22:21 crc kubenswrapper[4906]: I0310 00:22:21.704490 4906 generic.go:334] "Generic (PLEG): container finished" podID="0dd2f361-2dd6-42e5-a582-1aaa6492e5c8" containerID="d6ff1894b2f8a271632c88370502829b6ba09ca34ab61775a5a9ec3e405ba13e" exitCode=0 Mar 10 00:22:21 crc kubenswrapper[4906]: I0310 00:22:21.704762 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8","Type":"ContainerDied","Data":"d6ff1894b2f8a271632c88370502829b6ba09ca34ab61775a5a9ec3e405ba13e"} Mar 10 00:22:21 crc kubenswrapper[4906]: I0310 00:22:21.732602 4906 scope.go:117] "RemoveContainer" containerID="8336e4e5a87d14df8bdf132e789b69e30b4934d32a495d0bc21caea4fb7d1a68" Mar 10 00:22:21 crc kubenswrapper[4906]: I0310 00:22:21.770989 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-2-build_0dd2f361-2dd6-42e5-a582-1aaa6492e5c8/manage-dockerfile/0.log" Mar 10 00:22:22 crc kubenswrapper[4906]: I0310 00:22:22.724266 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8","Type":"ContainerStarted","Data":"fe6bb9890d3efa1bcb7e4aa02d8e9db368e9ab0b3e850997d32b2572a3d7d939"} Mar 10 00:22:22 crc kubenswrapper[4906]: I0310 00:22:22.775364 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-2-build" podStartSLOduration=5.775335962 podStartE2EDuration="5.775335962s" podCreationTimestamp="2026-03-10 00:22:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:22:22.761952104 +0000 UTC m=+968.909847256" watchObservedRunningTime="2026-03-10 
00:22:22.775335962 +0000 UTC m=+968.923231114" Mar 10 00:22:30 crc kubenswrapper[4906]: I0310 00:22:30.502780 4906 patch_prober.go:28] interesting pod/machine-config-daemon-bxtw4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:22:30 crc kubenswrapper[4906]: I0310 00:22:30.503517 4906 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:23:00 crc kubenswrapper[4906]: I0310 00:23:00.502426 4906 patch_prober.go:28] interesting pod/machine-config-daemon-bxtw4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:23:00 crc kubenswrapper[4906]: I0310 00:23:00.503093 4906 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:23:00 crc kubenswrapper[4906]: I0310 00:23:00.503146 4906 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" Mar 10 00:23:00 crc kubenswrapper[4906]: I0310 00:23:00.503735 4906 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"90546ce32412324c82386f9bf379f32c5b7b1738d24c8c80aab36b837d2ad814"} pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 00:23:00 crc kubenswrapper[4906]: I0310 00:23:00.503798 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" containerName="machine-config-daemon" containerID="cri-o://90546ce32412324c82386f9bf379f32c5b7b1738d24c8c80aab36b837d2ad814" gracePeriod=600 Mar 10 00:23:00 crc kubenswrapper[4906]: I0310 00:23:00.992511 4906 generic.go:334] "Generic (PLEG): container finished" podID="72d61d35-0a64-45a5-8df3-9c429727deba" containerID="90546ce32412324c82386f9bf379f32c5b7b1738d24c8c80aab36b837d2ad814" exitCode=0 Mar 10 00:23:00 crc kubenswrapper[4906]: I0310 00:23:00.992576 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" event={"ID":"72d61d35-0a64-45a5-8df3-9c429727deba","Type":"ContainerDied","Data":"90546ce32412324c82386f9bf379f32c5b7b1738d24c8c80aab36b837d2ad814"} Mar 10 00:23:00 crc kubenswrapper[4906]: I0310 00:23:00.993192 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" event={"ID":"72d61d35-0a64-45a5-8df3-9c429727deba","Type":"ContainerStarted","Data":"ecbb89ce657e5a301b803c163bbd70823102707b301c704efb62181f5820ef4b"} Mar 10 00:23:00 crc kubenswrapper[4906]: I0310 00:23:00.993228 4906 scope.go:117] "RemoveContainer" containerID="ebdf53f2bff9bcf61abafe2c602bdab6ed5145f512fe38143bc7c112b9a35137" Mar 10 00:23:23 crc kubenswrapper[4906]: I0310 00:23:23.379719 4906 generic.go:334] "Generic (PLEG): container finished" podID="0dd2f361-2dd6-42e5-a582-1aaa6492e5c8" 
containerID="fe6bb9890d3efa1bcb7e4aa02d8e9db368e9ab0b3e850997d32b2572a3d7d939" exitCode=0 Mar 10 00:23:23 crc kubenswrapper[4906]: I0310 00:23:23.379813 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8","Type":"ContainerDied","Data":"fe6bb9890d3efa1bcb7e4aa02d8e9db368e9ab0b3e850997d32b2572a3d7d939"} Mar 10 00:23:24 crc kubenswrapper[4906]: I0310 00:23:24.828261 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:23:24 crc kubenswrapper[4906]: I0310 00:23:24.852375 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-buildworkdir\") pod \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\" (UID: \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\") " Mar 10 00:23:24 crc kubenswrapper[4906]: I0310 00:23:24.852437 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-builder-dockercfg-7plhd-pull\") pod \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\" (UID: \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\") " Mar 10 00:23:24 crc kubenswrapper[4906]: I0310 00:23:24.852501 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-build-system-configs\") pod \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\" (UID: \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\") " Mar 10 00:23:24 crc kubenswrapper[4906]: I0310 00:23:24.852527 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-build-blob-cache\") pod 
\"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\" (UID: \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\") " Mar 10 00:23:24 crc kubenswrapper[4906]: I0310 00:23:24.852570 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-container-storage-root\") pod \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\" (UID: \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\") " Mar 10 00:23:24 crc kubenswrapper[4906]: I0310 00:23:24.852604 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-container-storage-run\") pod \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\" (UID: \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\") " Mar 10 00:23:24 crc kubenswrapper[4906]: I0310 00:23:24.852625 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-node-pullsecrets\") pod \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\" (UID: \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\") " Mar 10 00:23:24 crc kubenswrapper[4906]: I0310 00:23:24.852672 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-buildcachedir\") pod \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\" (UID: \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\") " Mar 10 00:23:24 crc kubenswrapper[4906]: I0310 00:23:24.852709 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-build-proxy-ca-bundles\") pod \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\" (UID: \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\") " Mar 10 00:23:24 crc kubenswrapper[4906]: I0310 00:23:24.852737 4906 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-builder-dockercfg-7plhd-push\") pod \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\" (UID: \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\") " Mar 10 00:23:24 crc kubenswrapper[4906]: I0310 00:23:24.852765 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8j6v\" (UniqueName: \"kubernetes.io/projected/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-kube-api-access-c8j6v\") pod \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\" (UID: \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\") " Mar 10 00:23:24 crc kubenswrapper[4906]: I0310 00:23:24.852810 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-build-ca-bundles\") pod \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\" (UID: \"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8\") " Mar 10 00:23:24 crc kubenswrapper[4906]: I0310 00:23:24.853179 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "0dd2f361-2dd6-42e5-a582-1aaa6492e5c8" (UID: "0dd2f361-2dd6-42e5-a582-1aaa6492e5c8"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:23:24 crc kubenswrapper[4906]: I0310 00:23:24.853366 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "0dd2f361-2dd6-42e5-a582-1aaa6492e5c8" (UID: "0dd2f361-2dd6-42e5-a582-1aaa6492e5c8"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:23:24 crc kubenswrapper[4906]: I0310 00:23:24.853956 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "0dd2f361-2dd6-42e5-a582-1aaa6492e5c8" (UID: "0dd2f361-2dd6-42e5-a582-1aaa6492e5c8"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:23:24 crc kubenswrapper[4906]: I0310 00:23:24.854454 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "0dd2f361-2dd6-42e5-a582-1aaa6492e5c8" (UID: "0dd2f361-2dd6-42e5-a582-1aaa6492e5c8"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:23:24 crc kubenswrapper[4906]: I0310 00:23:24.854578 4906 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 10 00:23:24 crc kubenswrapper[4906]: I0310 00:23:24.854595 4906 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 10 00:23:24 crc kubenswrapper[4906]: I0310 00:23:24.854607 4906 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 10 00:23:24 crc kubenswrapper[4906]: I0310 00:23:24.854619 4906 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:23:24 crc kubenswrapper[4906]: I0310 00:23:24.855115 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "0dd2f361-2dd6-42e5-a582-1aaa6492e5c8" (UID: "0dd2f361-2dd6-42e5-a582-1aaa6492e5c8"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:23:24 crc kubenswrapper[4906]: I0310 00:23:24.855365 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "0dd2f361-2dd6-42e5-a582-1aaa6492e5c8" (UID: "0dd2f361-2dd6-42e5-a582-1aaa6492e5c8"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:23:24 crc kubenswrapper[4906]: I0310 00:23:24.855795 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "0dd2f361-2dd6-42e5-a582-1aaa6492e5c8" (UID: "0dd2f361-2dd6-42e5-a582-1aaa6492e5c8"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:23:24 crc kubenswrapper[4906]: I0310 00:23:24.860234 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-kube-api-access-c8j6v" (OuterVolumeSpecName: "kube-api-access-c8j6v") pod "0dd2f361-2dd6-42e5-a582-1aaa6492e5c8" (UID: "0dd2f361-2dd6-42e5-a582-1aaa6492e5c8"). InnerVolumeSpecName "kube-api-access-c8j6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:23:24 crc kubenswrapper[4906]: I0310 00:23:24.860406 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-builder-dockercfg-7plhd-push" (OuterVolumeSpecName: "builder-dockercfg-7plhd-push") pod "0dd2f361-2dd6-42e5-a582-1aaa6492e5c8" (UID: "0dd2f361-2dd6-42e5-a582-1aaa6492e5c8"). InnerVolumeSpecName "builder-dockercfg-7plhd-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:23:24 crc kubenswrapper[4906]: I0310 00:23:24.868644 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-builder-dockercfg-7plhd-pull" (OuterVolumeSpecName: "builder-dockercfg-7plhd-pull") pod "0dd2f361-2dd6-42e5-a582-1aaa6492e5c8" (UID: "0dd2f361-2dd6-42e5-a582-1aaa6492e5c8"). InnerVolumeSpecName "builder-dockercfg-7plhd-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:23:24 crc kubenswrapper[4906]: I0310 00:23:24.955947 4906 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-builder-dockercfg-7plhd-push\") on node \"crc\" DevicePath \"\"" Mar 10 00:23:24 crc kubenswrapper[4906]: I0310 00:23:24.955987 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8j6v\" (UniqueName: \"kubernetes.io/projected/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-kube-api-access-c8j6v\") on node \"crc\" DevicePath \"\"" Mar 10 00:23:24 crc kubenswrapper[4906]: I0310 00:23:24.956000 4906 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:23:24 crc kubenswrapper[4906]: I0310 00:23:24.956011 4906 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" 
(UniqueName: \"kubernetes.io/empty-dir/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 10 00:23:24 crc kubenswrapper[4906]: I0310 00:23:24.956024 4906 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-builder-dockercfg-7plhd-pull\") on node \"crc\" DevicePath \"\"" Mar 10 00:23:24 crc kubenswrapper[4906]: I0310 00:23:24.956037 4906 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 10 00:23:25 crc kubenswrapper[4906]: I0310 00:23:25.041106 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "0dd2f361-2dd6-42e5-a582-1aaa6492e5c8" (UID: "0dd2f361-2dd6-42e5-a582-1aaa6492e5c8"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:23:25 crc kubenswrapper[4906]: I0310 00:23:25.057152 4906 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 10 00:23:25 crc kubenswrapper[4906]: I0310 00:23:25.400175 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"0dd2f361-2dd6-42e5-a582-1aaa6492e5c8","Type":"ContainerDied","Data":"2b40d396bf6ffac2ed473b04f1e313db9ab1dc94bddd2f684f45e1e39e150c12"} Mar 10 00:23:25 crc kubenswrapper[4906]: I0310 00:23:25.400230 4906 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b40d396bf6ffac2ed473b04f1e313db9ab1dc94bddd2f684f45e1e39e150c12" Mar 10 00:23:25 crc kubenswrapper[4906]: I0310 00:23:25.400841 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Mar 10 00:23:27 crc kubenswrapper[4906]: I0310 00:23:27.071020 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "0dd2f361-2dd6-42e5-a582-1aaa6492e5c8" (UID: "0dd2f361-2dd6-42e5-a582-1aaa6492e5c8"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:23:27 crc kubenswrapper[4906]: I0310 00:23:27.085717 4906 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/0dd2f361-2dd6-42e5-a582-1aaa6492e5c8-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 10 00:23:29 crc kubenswrapper[4906]: I0310 00:23:29.657936 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 10 00:23:29 crc kubenswrapper[4906]: E0310 00:23:29.658591 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0c307d3-7152-4589-9094-11f86de2552a" containerName="manage-dockerfile" Mar 10 00:23:29 crc kubenswrapper[4906]: I0310 00:23:29.658608 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0c307d3-7152-4589-9094-11f86de2552a" containerName="manage-dockerfile" Mar 10 00:23:29 crc kubenswrapper[4906]: E0310 00:23:29.658622 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0c307d3-7152-4589-9094-11f86de2552a" containerName="docker-build" Mar 10 00:23:29 crc kubenswrapper[4906]: I0310 00:23:29.658629 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0c307d3-7152-4589-9094-11f86de2552a" containerName="docker-build" Mar 10 00:23:29 crc kubenswrapper[4906]: E0310 00:23:29.658664 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dd2f361-2dd6-42e5-a582-1aaa6492e5c8" containerName="docker-build" Mar 10 00:23:29 crc kubenswrapper[4906]: I0310 00:23:29.658674 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dd2f361-2dd6-42e5-a582-1aaa6492e5c8" containerName="docker-build" Mar 10 00:23:29 crc kubenswrapper[4906]: E0310 00:23:29.658692 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dd2f361-2dd6-42e5-a582-1aaa6492e5c8" containerName="git-clone" Mar 10 00:23:29 crc kubenswrapper[4906]: I0310 00:23:29.658700 4906 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0dd2f361-2dd6-42e5-a582-1aaa6492e5c8" containerName="git-clone" Mar 10 00:23:29 crc kubenswrapper[4906]: E0310 00:23:29.658714 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dd2f361-2dd6-42e5-a582-1aaa6492e5c8" containerName="manage-dockerfile" Mar 10 00:23:29 crc kubenswrapper[4906]: I0310 00:23:29.658721 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dd2f361-2dd6-42e5-a582-1aaa6492e5c8" containerName="manage-dockerfile" Mar 10 00:23:29 crc kubenswrapper[4906]: I0310 00:23:29.658861 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0c307d3-7152-4589-9094-11f86de2552a" containerName="docker-build" Mar 10 00:23:29 crc kubenswrapper[4906]: I0310 00:23:29.658873 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dd2f361-2dd6-42e5-a582-1aaa6492e5c8" containerName="docker-build" Mar 10 00:23:29 crc kubenswrapper[4906]: I0310 00:23:29.659609 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-1-build" Mar 10 00:23:29 crc kubenswrapper[4906]: I0310 00:23:29.661954 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-1-sys-config" Mar 10 00:23:29 crc kubenswrapper[4906]: I0310 00:23:29.661983 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-1-ca" Mar 10 00:23:29 crc kubenswrapper[4906]: I0310 00:23:29.662946 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-7plhd" Mar 10 00:23:29 crc kubenswrapper[4906]: I0310 00:23:29.665074 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-1-global-ca" Mar 10 00:23:29 crc kubenswrapper[4906]: I0310 00:23:29.681428 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 10 00:23:29 crc kubenswrapper[4906]: I0310 00:23:29.820694 4906 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/c871b4bb-d234-4ceb-81d9-369ceb657836-build-system-configs\") pod \"sg-core-1-build\" (UID: \"c871b4bb-d234-4ceb-81d9-369ceb657836\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:23:29 crc kubenswrapper[4906]: I0310 00:23:29.820776 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c871b4bb-d234-4ceb-81d9-369ceb657836-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"c871b4bb-d234-4ceb-81d9-369ceb657836\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:23:29 crc kubenswrapper[4906]: I0310 00:23:29.821188 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/c871b4bb-d234-4ceb-81d9-369ceb657836-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"c871b4bb-d234-4ceb-81d9-369ceb657836\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:23:29 crc kubenswrapper[4906]: I0310 00:23:29.821393 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/c871b4bb-d234-4ceb-81d9-369ceb657836-buildcachedir\") pod \"sg-core-1-build\" (UID: \"c871b4bb-d234-4ceb-81d9-369ceb657836\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:23:29 crc kubenswrapper[4906]: I0310 00:23:29.821466 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/c871b4bb-d234-4ceb-81d9-369ceb657836-container-storage-root\") pod \"sg-core-1-build\" (UID: \"c871b4bb-d234-4ceb-81d9-369ceb657836\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:23:29 crc kubenswrapper[4906]: I0310 00:23:29.821533 4906 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c871b4bb-d234-4ceb-81d9-369ceb657836-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"c871b4bb-d234-4ceb-81d9-369ceb657836\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:23:29 crc kubenswrapper[4906]: I0310 00:23:29.821612 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/c871b4bb-d234-4ceb-81d9-369ceb657836-builder-dockercfg-7plhd-pull\") pod \"sg-core-1-build\" (UID: \"c871b4bb-d234-4ceb-81d9-369ceb657836\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:23:29 crc kubenswrapper[4906]: I0310 00:23:29.821736 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7q9ss\" (UniqueName: \"kubernetes.io/projected/c871b4bb-d234-4ceb-81d9-369ceb657836-kube-api-access-7q9ss\") pod \"sg-core-1-build\" (UID: \"c871b4bb-d234-4ceb-81d9-369ceb657836\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:23:29 crc kubenswrapper[4906]: I0310 00:23:29.821813 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c871b4bb-d234-4ceb-81d9-369ceb657836-node-pullsecrets\") pod \"sg-core-1-build\" (UID: \"c871b4bb-d234-4ceb-81d9-369ceb657836\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:23:29 crc kubenswrapper[4906]: I0310 00:23:29.821871 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/c871b4bb-d234-4ceb-81d9-369ceb657836-buildworkdir\") pod \"sg-core-1-build\" (UID: \"c871b4bb-d234-4ceb-81d9-369ceb657836\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:23:29 crc kubenswrapper[4906]: I0310 00:23:29.821934 4906 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/c871b4bb-d234-4ceb-81d9-369ceb657836-builder-dockercfg-7plhd-push\") pod \"sg-core-1-build\" (UID: \"c871b4bb-d234-4ceb-81d9-369ceb657836\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:23:29 crc kubenswrapper[4906]: I0310 00:23:29.821990 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/c871b4bb-d234-4ceb-81d9-369ceb657836-container-storage-run\") pod \"sg-core-1-build\" (UID: \"c871b4bb-d234-4ceb-81d9-369ceb657836\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:23:29 crc kubenswrapper[4906]: I0310 00:23:29.923241 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/c871b4bb-d234-4ceb-81d9-369ceb657836-buildcachedir\") pod \"sg-core-1-build\" (UID: \"c871b4bb-d234-4ceb-81d9-369ceb657836\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:23:29 crc kubenswrapper[4906]: I0310 00:23:29.923613 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/c871b4bb-d234-4ceb-81d9-369ceb657836-container-storage-root\") pod \"sg-core-1-build\" (UID: \"c871b4bb-d234-4ceb-81d9-369ceb657836\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:23:29 crc kubenswrapper[4906]: I0310 00:23:29.923829 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c871b4bb-d234-4ceb-81d9-369ceb657836-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"c871b4bb-d234-4ceb-81d9-369ceb657836\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:23:29 crc kubenswrapper[4906]: I0310 00:23:29.924052 4906 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/c871b4bb-d234-4ceb-81d9-369ceb657836-builder-dockercfg-7plhd-pull\") pod \"sg-core-1-build\" (UID: \"c871b4bb-d234-4ceb-81d9-369ceb657836\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:23:29 crc kubenswrapper[4906]: I0310 00:23:29.924226 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7q9ss\" (UniqueName: \"kubernetes.io/projected/c871b4bb-d234-4ceb-81d9-369ceb657836-kube-api-access-7q9ss\") pod \"sg-core-1-build\" (UID: \"c871b4bb-d234-4ceb-81d9-369ceb657836\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:23:29 crc kubenswrapper[4906]: I0310 00:23:29.924404 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c871b4bb-d234-4ceb-81d9-369ceb657836-node-pullsecrets\") pod \"sg-core-1-build\" (UID: \"c871b4bb-d234-4ceb-81d9-369ceb657836\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:23:29 crc kubenswrapper[4906]: I0310 00:23:29.924571 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/c871b4bb-d234-4ceb-81d9-369ceb657836-buildworkdir\") pod \"sg-core-1-build\" (UID: \"c871b4bb-d234-4ceb-81d9-369ceb657836\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:23:29 crc kubenswrapper[4906]: I0310 00:23:29.924788 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/c871b4bb-d234-4ceb-81d9-369ceb657836-builder-dockercfg-7plhd-push\") pod \"sg-core-1-build\" (UID: \"c871b4bb-d234-4ceb-81d9-369ceb657836\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:23:29 crc kubenswrapper[4906]: I0310 00:23:29.925067 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" 
(UniqueName: \"kubernetes.io/empty-dir/c871b4bb-d234-4ceb-81d9-369ceb657836-container-storage-run\") pod \"sg-core-1-build\" (UID: \"c871b4bb-d234-4ceb-81d9-369ceb657836\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:23:29 crc kubenswrapper[4906]: I0310 00:23:29.925220 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/c871b4bb-d234-4ceb-81d9-369ceb657836-build-system-configs\") pod \"sg-core-1-build\" (UID: \"c871b4bb-d234-4ceb-81d9-369ceb657836\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:23:29 crc kubenswrapper[4906]: I0310 00:23:29.924842 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c871b4bb-d234-4ceb-81d9-369ceb657836-node-pullsecrets\") pod \"sg-core-1-build\" (UID: \"c871b4bb-d234-4ceb-81d9-369ceb657836\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:23:29 crc kubenswrapper[4906]: I0310 00:23:29.924972 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/c871b4bb-d234-4ceb-81d9-369ceb657836-container-storage-root\") pod \"sg-core-1-build\" (UID: \"c871b4bb-d234-4ceb-81d9-369ceb657836\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:23:29 crc kubenswrapper[4906]: I0310 00:23:29.925376 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c871b4bb-d234-4ceb-81d9-369ceb657836-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"c871b4bb-d234-4ceb-81d9-369ceb657836\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:23:29 crc kubenswrapper[4906]: I0310 00:23:29.925607 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/c871b4bb-d234-4ceb-81d9-369ceb657836-buildworkdir\") pod \"sg-core-1-build\" (UID: 
\"c871b4bb-d234-4ceb-81d9-369ceb657836\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:23:29 crc kubenswrapper[4906]: I0310 00:23:29.923429 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/c871b4bb-d234-4ceb-81d9-369ceb657836-buildcachedir\") pod \"sg-core-1-build\" (UID: \"c871b4bb-d234-4ceb-81d9-369ceb657836\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:23:29 crc kubenswrapper[4906]: I0310 00:23:29.925832 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/c871b4bb-d234-4ceb-81d9-369ceb657836-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"c871b4bb-d234-4ceb-81d9-369ceb657836\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:23:29 crc kubenswrapper[4906]: I0310 00:23:29.926011 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/c871b4bb-d234-4ceb-81d9-369ceb657836-build-system-configs\") pod \"sg-core-1-build\" (UID: \"c871b4bb-d234-4ceb-81d9-369ceb657836\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:23:29 crc kubenswrapper[4906]: I0310 00:23:29.924585 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c871b4bb-d234-4ceb-81d9-369ceb657836-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"c871b4bb-d234-4ceb-81d9-369ceb657836\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:23:29 crc kubenswrapper[4906]: I0310 00:23:29.926356 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/c871b4bb-d234-4ceb-81d9-369ceb657836-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"c871b4bb-d234-4ceb-81d9-369ceb657836\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:23:29 crc kubenswrapper[4906]: I0310 
00:23:29.927200 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/c871b4bb-d234-4ceb-81d9-369ceb657836-container-storage-run\") pod \"sg-core-1-build\" (UID: \"c871b4bb-d234-4ceb-81d9-369ceb657836\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:23:29 crc kubenswrapper[4906]: I0310 00:23:29.929239 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c871b4bb-d234-4ceb-81d9-369ceb657836-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"c871b4bb-d234-4ceb-81d9-369ceb657836\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:23:29 crc kubenswrapper[4906]: I0310 00:23:29.943332 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/c871b4bb-d234-4ceb-81d9-369ceb657836-builder-dockercfg-7plhd-push\") pod \"sg-core-1-build\" (UID: \"c871b4bb-d234-4ceb-81d9-369ceb657836\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:23:29 crc kubenswrapper[4906]: I0310 00:23:29.943521 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/c871b4bb-d234-4ceb-81d9-369ceb657836-builder-dockercfg-7plhd-pull\") pod \"sg-core-1-build\" (UID: \"c871b4bb-d234-4ceb-81d9-369ceb657836\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:23:29 crc kubenswrapper[4906]: I0310 00:23:29.952020 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7q9ss\" (UniqueName: \"kubernetes.io/projected/c871b4bb-d234-4ceb-81d9-369ceb657836-kube-api-access-7q9ss\") pod \"sg-core-1-build\" (UID: \"c871b4bb-d234-4ceb-81d9-369ceb657836\") " pod="service-telemetry/sg-core-1-build" Mar 10 00:23:29 crc kubenswrapper[4906]: I0310 00:23:29.982544 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-1-build" Mar 10 00:23:30 crc kubenswrapper[4906]: I0310 00:23:30.231919 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 10 00:23:30 crc kubenswrapper[4906]: I0310 00:23:30.435002 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"c871b4bb-d234-4ceb-81d9-369ceb657836","Type":"ContainerStarted","Data":"9e7246eb171a3b187452829b25516aca009e0212be980866bf96df1c3f5e48bb"} Mar 10 00:23:31 crc kubenswrapper[4906]: I0310 00:23:31.447309 4906 generic.go:334] "Generic (PLEG): container finished" podID="c871b4bb-d234-4ceb-81d9-369ceb657836" containerID="f0dd06c604ad116add3bb409ef099fb5f5bb262a9c5fdc11086cf92b9aaf8207" exitCode=0 Mar 10 00:23:31 crc kubenswrapper[4906]: I0310 00:23:31.447430 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"c871b4bb-d234-4ceb-81d9-369ceb657836","Type":"ContainerDied","Data":"f0dd06c604ad116add3bb409ef099fb5f5bb262a9c5fdc11086cf92b9aaf8207"} Mar 10 00:23:32 crc kubenswrapper[4906]: I0310 00:23:32.462032 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"c871b4bb-d234-4ceb-81d9-369ceb657836","Type":"ContainerStarted","Data":"f6605a0735c8a38b8fa1b8d6235fa5b610532db34fc6c9083dfe28100d554150"} Mar 10 00:23:32 crc kubenswrapper[4906]: I0310 00:23:32.507398 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-core-1-build" podStartSLOduration=3.507371365 podStartE2EDuration="3.507371365s" podCreationTimestamp="2026-03-10 00:23:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:23:32.494003747 +0000 UTC m=+1038.641898879" watchObservedRunningTime="2026-03-10 00:23:32.507371365 +0000 UTC m=+1038.655266507" Mar 10 00:23:39 crc 
kubenswrapper[4906]: I0310 00:23:39.951976 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 10 00:23:39 crc kubenswrapper[4906]: I0310 00:23:39.953062 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/sg-core-1-build" podUID="c871b4bb-d234-4ceb-81d9-369ceb657836" containerName="docker-build" containerID="cri-o://f6605a0735c8a38b8fa1b8d6235fa5b610532db34fc6c9083dfe28100d554150" gracePeriod=30 Mar 10 00:23:40 crc kubenswrapper[4906]: I0310 00:23:40.380327 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-1-build_c871b4bb-d234-4ceb-81d9-369ceb657836/docker-build/0.log" Mar 10 00:23:40 crc kubenswrapper[4906]: I0310 00:23:40.381029 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-1-build" Mar 10 00:23:40 crc kubenswrapper[4906]: I0310 00:23:40.524718 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-1-build_c871b4bb-d234-4ceb-81d9-369ceb657836/docker-build/0.log" Mar 10 00:23:40 crc kubenswrapper[4906]: I0310 00:23:40.525249 4906 generic.go:334] "Generic (PLEG): container finished" podID="c871b4bb-d234-4ceb-81d9-369ceb657836" containerID="f6605a0735c8a38b8fa1b8d6235fa5b610532db34fc6c9083dfe28100d554150" exitCode=1 Mar 10 00:23:40 crc kubenswrapper[4906]: I0310 00:23:40.525308 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"c871b4bb-d234-4ceb-81d9-369ceb657836","Type":"ContainerDied","Data":"f6605a0735c8a38b8fa1b8d6235fa5b610532db34fc6c9083dfe28100d554150"} Mar 10 00:23:40 crc kubenswrapper[4906]: I0310 00:23:40.525349 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"c871b4bb-d234-4ceb-81d9-369ceb657836","Type":"ContainerDied","Data":"9e7246eb171a3b187452829b25516aca009e0212be980866bf96df1c3f5e48bb"} Mar 10 
00:23:40 crc kubenswrapper[4906]: I0310 00:23:40.525376 4906 scope.go:117] "RemoveContainer" containerID="f6605a0735c8a38b8fa1b8d6235fa5b610532db34fc6c9083dfe28100d554150" Mar 10 00:23:40 crc kubenswrapper[4906]: I0310 00:23:40.525557 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-1-build" Mar 10 00:23:40 crc kubenswrapper[4906]: I0310 00:23:40.564261 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/c871b4bb-d234-4ceb-81d9-369ceb657836-buildworkdir\") pod \"c871b4bb-d234-4ceb-81d9-369ceb657836\" (UID: \"c871b4bb-d234-4ceb-81d9-369ceb657836\") " Mar 10 00:23:40 crc kubenswrapper[4906]: I0310 00:23:40.564963 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/c871b4bb-d234-4ceb-81d9-369ceb657836-build-system-configs\") pod \"c871b4bb-d234-4ceb-81d9-369ceb657836\" (UID: \"c871b4bb-d234-4ceb-81d9-369ceb657836\") " Mar 10 00:23:40 crc kubenswrapper[4906]: I0310 00:23:40.565034 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/c871b4bb-d234-4ceb-81d9-369ceb657836-build-blob-cache\") pod \"c871b4bb-d234-4ceb-81d9-369ceb657836\" (UID: \"c871b4bb-d234-4ceb-81d9-369ceb657836\") " Mar 10 00:23:40 crc kubenswrapper[4906]: I0310 00:23:40.565082 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c871b4bb-d234-4ceb-81d9-369ceb657836-node-pullsecrets\") pod \"c871b4bb-d234-4ceb-81d9-369ceb657836\" (UID: \"c871b4bb-d234-4ceb-81d9-369ceb657836\") " Mar 10 00:23:40 crc kubenswrapper[4906]: I0310 00:23:40.565107 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/c871b4bb-d234-4ceb-81d9-369ceb657836-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "c871b4bb-d234-4ceb-81d9-369ceb657836" (UID: "c871b4bb-d234-4ceb-81d9-369ceb657836"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:23:40 crc kubenswrapper[4906]: I0310 00:23:40.565148 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c871b4bb-d234-4ceb-81d9-369ceb657836-build-ca-bundles\") pod \"c871b4bb-d234-4ceb-81d9-369ceb657836\" (UID: \"c871b4bb-d234-4ceb-81d9-369ceb657836\") " Mar 10 00:23:40 crc kubenswrapper[4906]: I0310 00:23:40.565201 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/c871b4bb-d234-4ceb-81d9-369ceb657836-container-storage-run\") pod \"c871b4bb-d234-4ceb-81d9-369ceb657836\" (UID: \"c871b4bb-d234-4ceb-81d9-369ceb657836\") " Mar 10 00:23:40 crc kubenswrapper[4906]: I0310 00:23:40.565259 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/c871b4bb-d234-4ceb-81d9-369ceb657836-builder-dockercfg-7plhd-pull\") pod \"c871b4bb-d234-4ceb-81d9-369ceb657836\" (UID: \"c871b4bb-d234-4ceb-81d9-369ceb657836\") " Mar 10 00:23:40 crc kubenswrapper[4906]: I0310 00:23:40.565321 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7q9ss\" (UniqueName: \"kubernetes.io/projected/c871b4bb-d234-4ceb-81d9-369ceb657836-kube-api-access-7q9ss\") pod \"c871b4bb-d234-4ceb-81d9-369ceb657836\" (UID: \"c871b4bb-d234-4ceb-81d9-369ceb657836\") " Mar 10 00:23:40 crc kubenswrapper[4906]: I0310 00:23:40.565507 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: 
\"kubernetes.io/host-path/c871b4bb-d234-4ceb-81d9-369ceb657836-buildcachedir\") pod \"c871b4bb-d234-4ceb-81d9-369ceb657836\" (UID: \"c871b4bb-d234-4ceb-81d9-369ceb657836\") " Mar 10 00:23:40 crc kubenswrapper[4906]: I0310 00:23:40.565613 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/c871b4bb-d234-4ceb-81d9-369ceb657836-container-storage-root\") pod \"c871b4bb-d234-4ceb-81d9-369ceb657836\" (UID: \"c871b4bb-d234-4ceb-81d9-369ceb657836\") " Mar 10 00:23:40 crc kubenswrapper[4906]: I0310 00:23:40.565755 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/c871b4bb-d234-4ceb-81d9-369ceb657836-builder-dockercfg-7plhd-push\") pod \"c871b4bb-d234-4ceb-81d9-369ceb657836\" (UID: \"c871b4bb-d234-4ceb-81d9-369ceb657836\") " Mar 10 00:23:40 crc kubenswrapper[4906]: I0310 00:23:40.565831 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c871b4bb-d234-4ceb-81d9-369ceb657836-build-proxy-ca-bundles\") pod \"c871b4bb-d234-4ceb-81d9-369ceb657836\" (UID: \"c871b4bb-d234-4ceb-81d9-369ceb657836\") " Mar 10 00:23:40 crc kubenswrapper[4906]: I0310 00:23:40.566426 4906 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/c871b4bb-d234-4ceb-81d9-369ceb657836-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 10 00:23:40 crc kubenswrapper[4906]: I0310 00:23:40.567564 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c871b4bb-d234-4ceb-81d9-369ceb657836-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "c871b4bb-d234-4ceb-81d9-369ceb657836" (UID: "c871b4bb-d234-4ceb-81d9-369ceb657836"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:23:40 crc kubenswrapper[4906]: I0310 00:23:40.567677 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c871b4bb-d234-4ceb-81d9-369ceb657836-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "c871b4bb-d234-4ceb-81d9-369ceb657836" (UID: "c871b4bb-d234-4ceb-81d9-369ceb657836"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:23:40 crc kubenswrapper[4906]: I0310 00:23:40.568581 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c871b4bb-d234-4ceb-81d9-369ceb657836-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "c871b4bb-d234-4ceb-81d9-369ceb657836" (UID: "c871b4bb-d234-4ceb-81d9-369ceb657836"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:23:40 crc kubenswrapper[4906]: I0310 00:23:40.570210 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c871b4bb-d234-4ceb-81d9-369ceb657836-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "c871b4bb-d234-4ceb-81d9-369ceb657836" (UID: "c871b4bb-d234-4ceb-81d9-369ceb657836"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:23:40 crc kubenswrapper[4906]: I0310 00:23:40.570859 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c871b4bb-d234-4ceb-81d9-369ceb657836-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "c871b4bb-d234-4ceb-81d9-369ceb657836" (UID: "c871b4bb-d234-4ceb-81d9-369ceb657836"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:23:40 crc kubenswrapper[4906]: I0310 00:23:40.571701 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c871b4bb-d234-4ceb-81d9-369ceb657836-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "c871b4bb-d234-4ceb-81d9-369ceb657836" (UID: "c871b4bb-d234-4ceb-81d9-369ceb657836"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:23:40 crc kubenswrapper[4906]: I0310 00:23:40.578168 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c871b4bb-d234-4ceb-81d9-369ceb657836-builder-dockercfg-7plhd-push" (OuterVolumeSpecName: "builder-dockercfg-7plhd-push") pod "c871b4bb-d234-4ceb-81d9-369ceb657836" (UID: "c871b4bb-d234-4ceb-81d9-369ceb657836"). InnerVolumeSpecName "builder-dockercfg-7plhd-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:23:40 crc kubenswrapper[4906]: I0310 00:23:40.580807 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c871b4bb-d234-4ceb-81d9-369ceb657836-kube-api-access-7q9ss" (OuterVolumeSpecName: "kube-api-access-7q9ss") pod "c871b4bb-d234-4ceb-81d9-369ceb657836" (UID: "c871b4bb-d234-4ceb-81d9-369ceb657836"). InnerVolumeSpecName "kube-api-access-7q9ss". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:23:40 crc kubenswrapper[4906]: I0310 00:23:40.586194 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c871b4bb-d234-4ceb-81d9-369ceb657836-builder-dockercfg-7plhd-pull" (OuterVolumeSpecName: "builder-dockercfg-7plhd-pull") pod "c871b4bb-d234-4ceb-81d9-369ceb657836" (UID: "c871b4bb-d234-4ceb-81d9-369ceb657836"). InnerVolumeSpecName "builder-dockercfg-7plhd-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:23:40 crc kubenswrapper[4906]: I0310 00:23:40.602179 4906 scope.go:117] "RemoveContainer" containerID="f0dd06c604ad116add3bb409ef099fb5f5bb262a9c5fdc11086cf92b9aaf8207" Mar 10 00:23:40 crc kubenswrapper[4906]: I0310 00:23:40.661269 4906 scope.go:117] "RemoveContainer" containerID="f6605a0735c8a38b8fa1b8d6235fa5b610532db34fc6c9083dfe28100d554150" Mar 10 00:23:40 crc kubenswrapper[4906]: E0310 00:23:40.661866 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6605a0735c8a38b8fa1b8d6235fa5b610532db34fc6c9083dfe28100d554150\": container with ID starting with f6605a0735c8a38b8fa1b8d6235fa5b610532db34fc6c9083dfe28100d554150 not found: ID does not exist" containerID="f6605a0735c8a38b8fa1b8d6235fa5b610532db34fc6c9083dfe28100d554150" Mar 10 00:23:40 crc kubenswrapper[4906]: I0310 00:23:40.661954 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6605a0735c8a38b8fa1b8d6235fa5b610532db34fc6c9083dfe28100d554150"} err="failed to get container status \"f6605a0735c8a38b8fa1b8d6235fa5b610532db34fc6c9083dfe28100d554150\": rpc error: code = NotFound desc = could not find container \"f6605a0735c8a38b8fa1b8d6235fa5b610532db34fc6c9083dfe28100d554150\": container with ID starting with f6605a0735c8a38b8fa1b8d6235fa5b610532db34fc6c9083dfe28100d554150 not found: ID does not exist" Mar 10 00:23:40 crc kubenswrapper[4906]: I0310 00:23:40.662010 4906 scope.go:117] "RemoveContainer" containerID="f0dd06c604ad116add3bb409ef099fb5f5bb262a9c5fdc11086cf92b9aaf8207" Mar 10 00:23:40 crc kubenswrapper[4906]: E0310 00:23:40.662900 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0dd06c604ad116add3bb409ef099fb5f5bb262a9c5fdc11086cf92b9aaf8207\": container with ID starting with 
f0dd06c604ad116add3bb409ef099fb5f5bb262a9c5fdc11086cf92b9aaf8207 not found: ID does not exist" containerID="f0dd06c604ad116add3bb409ef099fb5f5bb262a9c5fdc11086cf92b9aaf8207" Mar 10 00:23:40 crc kubenswrapper[4906]: I0310 00:23:40.662943 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0dd06c604ad116add3bb409ef099fb5f5bb262a9c5fdc11086cf92b9aaf8207"} err="failed to get container status \"f0dd06c604ad116add3bb409ef099fb5f5bb262a9c5fdc11086cf92b9aaf8207\": rpc error: code = NotFound desc = could not find container \"f0dd06c604ad116add3bb409ef099fb5f5bb262a9c5fdc11086cf92b9aaf8207\": container with ID starting with f0dd06c604ad116add3bb409ef099fb5f5bb262a9c5fdc11086cf92b9aaf8207 not found: ID does not exist" Mar 10 00:23:40 crc kubenswrapper[4906]: I0310 00:23:40.667318 4906 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/c871b4bb-d234-4ceb-81d9-369ceb657836-builder-dockercfg-7plhd-push\") on node \"crc\" DevicePath \"\"" Mar 10 00:23:40 crc kubenswrapper[4906]: I0310 00:23:40.667347 4906 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c871b4bb-d234-4ceb-81d9-369ceb657836-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:23:40 crc kubenswrapper[4906]: I0310 00:23:40.667362 4906 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/c871b4bb-d234-4ceb-81d9-369ceb657836-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 10 00:23:40 crc kubenswrapper[4906]: I0310 00:23:40.667380 4906 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c871b4bb-d234-4ceb-81d9-369ceb657836-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 10 00:23:40 crc kubenswrapper[4906]: I0310 00:23:40.667397 4906 reconciler_common.go:293] "Volume 
detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c871b4bb-d234-4ceb-81d9-369ceb657836-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:23:40 crc kubenswrapper[4906]: I0310 00:23:40.667412 4906 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/c871b4bb-d234-4ceb-81d9-369ceb657836-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 10 00:23:40 crc kubenswrapper[4906]: I0310 00:23:40.667432 4906 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/c871b4bb-d234-4ceb-81d9-369ceb657836-builder-dockercfg-7plhd-pull\") on node \"crc\" DevicePath \"\"" Mar 10 00:23:40 crc kubenswrapper[4906]: I0310 00:23:40.667444 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7q9ss\" (UniqueName: \"kubernetes.io/projected/c871b4bb-d234-4ceb-81d9-369ceb657836-kube-api-access-7q9ss\") on node \"crc\" DevicePath \"\"" Mar 10 00:23:40 crc kubenswrapper[4906]: I0310 00:23:40.667455 4906 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/c871b4bb-d234-4ceb-81d9-369ceb657836-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 10 00:23:40 crc kubenswrapper[4906]: I0310 00:23:40.723565 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c871b4bb-d234-4ceb-81d9-369ceb657836-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "c871b4bb-d234-4ceb-81d9-369ceb657836" (UID: "c871b4bb-d234-4ceb-81d9-369ceb657836"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:23:40 crc kubenswrapper[4906]: I0310 00:23:40.731967 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c871b4bb-d234-4ceb-81d9-369ceb657836-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "c871b4bb-d234-4ceb-81d9-369ceb657836" (UID: "c871b4bb-d234-4ceb-81d9-369ceb657836"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:23:40 crc kubenswrapper[4906]: I0310 00:23:40.769158 4906 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/c871b4bb-d234-4ceb-81d9-369ceb657836-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 10 00:23:40 crc kubenswrapper[4906]: I0310 00:23:40.769927 4906 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/c871b4bb-d234-4ceb-81d9-369ceb657836-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 10 00:23:40 crc kubenswrapper[4906]: I0310 00:23:40.862362 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 10 00:23:40 crc kubenswrapper[4906]: I0310 00:23:40.865950 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 10 00:23:41 crc kubenswrapper[4906]: I0310 00:23:41.649992 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-core-2-build"] Mar 10 00:23:41 crc kubenswrapper[4906]: E0310 00:23:41.650440 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c871b4bb-d234-4ceb-81d9-369ceb657836" containerName="docker-build" Mar 10 00:23:41 crc kubenswrapper[4906]: I0310 00:23:41.650473 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="c871b4bb-d234-4ceb-81d9-369ceb657836" containerName="docker-build" Mar 10 00:23:41 crc kubenswrapper[4906]: E0310 
00:23:41.650509 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c871b4bb-d234-4ceb-81d9-369ceb657836" containerName="manage-dockerfile" Mar 10 00:23:41 crc kubenswrapper[4906]: I0310 00:23:41.650529 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="c871b4bb-d234-4ceb-81d9-369ceb657836" containerName="manage-dockerfile" Mar 10 00:23:41 crc kubenswrapper[4906]: I0310 00:23:41.650836 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="c871b4bb-d234-4ceb-81d9-369ceb657836" containerName="docker-build" Mar 10 00:23:41 crc kubenswrapper[4906]: I0310 00:23:41.657521 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-2-build" Mar 10 00:23:41 crc kubenswrapper[4906]: I0310 00:23:41.660196 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-2-sys-config" Mar 10 00:23:41 crc kubenswrapper[4906]: I0310 00:23:41.660412 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-2-global-ca" Mar 10 00:23:41 crc kubenswrapper[4906]: I0310 00:23:41.660756 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-2-ca" Mar 10 00:23:41 crc kubenswrapper[4906]: I0310 00:23:41.660862 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-7plhd" Mar 10 00:23:41 crc kubenswrapper[4906]: I0310 00:23:41.686457 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-2-build"] Mar 10 00:23:41 crc kubenswrapper[4906]: I0310 00:23:41.786116 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/bbf3660c-b068-476f-94c3-8a55f1ce1679-container-storage-root\") pod \"sg-core-2-build\" (UID: \"bbf3660c-b068-476f-94c3-8a55f1ce1679\") " pod="service-telemetry/sg-core-2-build" Mar 10 
00:23:41 crc kubenswrapper[4906]: I0310 00:23:41.786162 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bbf3660c-b068-476f-94c3-8a55f1ce1679-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"bbf3660c-b068-476f-94c3-8a55f1ce1679\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:23:41 crc kubenswrapper[4906]: I0310 00:23:41.786187 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/bbf3660c-b068-476f-94c3-8a55f1ce1679-buildworkdir\") pod \"sg-core-2-build\" (UID: \"bbf3660c-b068-476f-94c3-8a55f1ce1679\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:23:41 crc kubenswrapper[4906]: I0310 00:23:41.786212 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/bbf3660c-b068-476f-94c3-8a55f1ce1679-buildcachedir\") pod \"sg-core-2-build\" (UID: \"bbf3660c-b068-476f-94c3-8a55f1ce1679\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:23:41 crc kubenswrapper[4906]: I0310 00:23:41.786247 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bbf3660c-b068-476f-94c3-8a55f1ce1679-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"bbf3660c-b068-476f-94c3-8a55f1ce1679\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:23:41 crc kubenswrapper[4906]: I0310 00:23:41.786274 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dck8j\" (UniqueName: \"kubernetes.io/projected/bbf3660c-b068-476f-94c3-8a55f1ce1679-kube-api-access-dck8j\") pod \"sg-core-2-build\" (UID: \"bbf3660c-b068-476f-94c3-8a55f1ce1679\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:23:41 crc kubenswrapper[4906]: 
I0310 00:23:41.786297 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bbf3660c-b068-476f-94c3-8a55f1ce1679-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"bbf3660c-b068-476f-94c3-8a55f1ce1679\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:23:41 crc kubenswrapper[4906]: I0310 00:23:41.786489 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/bbf3660c-b068-476f-94c3-8a55f1ce1679-container-storage-run\") pod \"sg-core-2-build\" (UID: \"bbf3660c-b068-476f-94c3-8a55f1ce1679\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:23:41 crc kubenswrapper[4906]: I0310 00:23:41.786691 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/bbf3660c-b068-476f-94c3-8a55f1ce1679-builder-dockercfg-7plhd-push\") pod \"sg-core-2-build\" (UID: \"bbf3660c-b068-476f-94c3-8a55f1ce1679\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:23:41 crc kubenswrapper[4906]: I0310 00:23:41.786799 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/bbf3660c-b068-476f-94c3-8a55f1ce1679-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"bbf3660c-b068-476f-94c3-8a55f1ce1679\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:23:41 crc kubenswrapper[4906]: I0310 00:23:41.786955 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/bbf3660c-b068-476f-94c3-8a55f1ce1679-builder-dockercfg-7plhd-pull\") pod \"sg-core-2-build\" (UID: \"bbf3660c-b068-476f-94c3-8a55f1ce1679\") " pod="service-telemetry/sg-core-2-build" Mar 10 
00:23:41 crc kubenswrapper[4906]: I0310 00:23:41.787019 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/bbf3660c-b068-476f-94c3-8a55f1ce1679-build-system-configs\") pod \"sg-core-2-build\" (UID: \"bbf3660c-b068-476f-94c3-8a55f1ce1679\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:23:41 crc kubenswrapper[4906]: I0310 00:23:41.888473 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bbf3660c-b068-476f-94c3-8a55f1ce1679-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"bbf3660c-b068-476f-94c3-8a55f1ce1679\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:23:41 crc kubenswrapper[4906]: I0310 00:23:41.888557 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dck8j\" (UniqueName: \"kubernetes.io/projected/bbf3660c-b068-476f-94c3-8a55f1ce1679-kube-api-access-dck8j\") pod \"sg-core-2-build\" (UID: \"bbf3660c-b068-476f-94c3-8a55f1ce1679\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:23:41 crc kubenswrapper[4906]: I0310 00:23:41.888593 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bbf3660c-b068-476f-94c3-8a55f1ce1679-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"bbf3660c-b068-476f-94c3-8a55f1ce1679\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:23:41 crc kubenswrapper[4906]: I0310 00:23:41.888669 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/bbf3660c-b068-476f-94c3-8a55f1ce1679-container-storage-run\") pod \"sg-core-2-build\" (UID: \"bbf3660c-b068-476f-94c3-8a55f1ce1679\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:23:41 crc kubenswrapper[4906]: I0310 00:23:41.888739 4906 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/bbf3660c-b068-476f-94c3-8a55f1ce1679-builder-dockercfg-7plhd-push\") pod \"sg-core-2-build\" (UID: \"bbf3660c-b068-476f-94c3-8a55f1ce1679\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:23:41 crc kubenswrapper[4906]: I0310 00:23:41.888773 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/bbf3660c-b068-476f-94c3-8a55f1ce1679-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"bbf3660c-b068-476f-94c3-8a55f1ce1679\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:23:41 crc kubenswrapper[4906]: I0310 00:23:41.888826 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/bbf3660c-b068-476f-94c3-8a55f1ce1679-builder-dockercfg-7plhd-pull\") pod \"sg-core-2-build\" (UID: \"bbf3660c-b068-476f-94c3-8a55f1ce1679\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:23:41 crc kubenswrapper[4906]: I0310 00:23:41.888864 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/bbf3660c-b068-476f-94c3-8a55f1ce1679-build-system-configs\") pod \"sg-core-2-build\" (UID: \"bbf3660c-b068-476f-94c3-8a55f1ce1679\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:23:41 crc kubenswrapper[4906]: I0310 00:23:41.889674 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bbf3660c-b068-476f-94c3-8a55f1ce1679-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"bbf3660c-b068-476f-94c3-8a55f1ce1679\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:23:41 crc kubenswrapper[4906]: I0310 00:23:41.889159 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/bbf3660c-b068-476f-94c3-8a55f1ce1679-container-storage-run\") pod \"sg-core-2-build\" (UID: \"bbf3660c-b068-476f-94c3-8a55f1ce1679\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:23:41 crc kubenswrapper[4906]: I0310 00:23:41.889520 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bbf3660c-b068-476f-94c3-8a55f1ce1679-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"bbf3660c-b068-476f-94c3-8a55f1ce1679\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:23:41 crc kubenswrapper[4906]: I0310 00:23:41.889354 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/bbf3660c-b068-476f-94c3-8a55f1ce1679-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"bbf3660c-b068-476f-94c3-8a55f1ce1679\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:23:41 crc kubenswrapper[4906]: I0310 00:23:41.889785 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/bbf3660c-b068-476f-94c3-8a55f1ce1679-container-storage-root\") pod \"sg-core-2-build\" (UID: \"bbf3660c-b068-476f-94c3-8a55f1ce1679\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:23:41 crc kubenswrapper[4906]: I0310 00:23:41.889811 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bbf3660c-b068-476f-94c3-8a55f1ce1679-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"bbf3660c-b068-476f-94c3-8a55f1ce1679\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:23:41 crc kubenswrapper[4906]: I0310 00:23:41.889855 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/bbf3660c-b068-476f-94c3-8a55f1ce1679-buildworkdir\") pod 
\"sg-core-2-build\" (UID: \"bbf3660c-b068-476f-94c3-8a55f1ce1679\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:23:41 crc kubenswrapper[4906]: I0310 00:23:41.889882 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/bbf3660c-b068-476f-94c3-8a55f1ce1679-buildcachedir\") pod \"sg-core-2-build\" (UID: \"bbf3660c-b068-476f-94c3-8a55f1ce1679\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:23:41 crc kubenswrapper[4906]: I0310 00:23:41.889953 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bbf3660c-b068-476f-94c3-8a55f1ce1679-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"bbf3660c-b068-476f-94c3-8a55f1ce1679\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:23:41 crc kubenswrapper[4906]: I0310 00:23:41.889991 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/bbf3660c-b068-476f-94c3-8a55f1ce1679-buildcachedir\") pod \"sg-core-2-build\" (UID: \"bbf3660c-b068-476f-94c3-8a55f1ce1679\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:23:41 crc kubenswrapper[4906]: I0310 00:23:41.890158 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/bbf3660c-b068-476f-94c3-8a55f1ce1679-buildworkdir\") pod \"sg-core-2-build\" (UID: \"bbf3660c-b068-476f-94c3-8a55f1ce1679\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:23:41 crc kubenswrapper[4906]: I0310 00:23:41.890382 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/bbf3660c-b068-476f-94c3-8a55f1ce1679-container-storage-root\") pod \"sg-core-2-build\" (UID: \"bbf3660c-b068-476f-94c3-8a55f1ce1679\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:23:41 crc kubenswrapper[4906]: I0310 
00:23:41.890525 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/bbf3660c-b068-476f-94c3-8a55f1ce1679-build-system-configs\") pod \"sg-core-2-build\" (UID: \"bbf3660c-b068-476f-94c3-8a55f1ce1679\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:23:41 crc kubenswrapper[4906]: I0310 00:23:41.896442 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/bbf3660c-b068-476f-94c3-8a55f1ce1679-builder-dockercfg-7plhd-pull\") pod \"sg-core-2-build\" (UID: \"bbf3660c-b068-476f-94c3-8a55f1ce1679\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:23:41 crc kubenswrapper[4906]: I0310 00:23:41.896493 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/bbf3660c-b068-476f-94c3-8a55f1ce1679-builder-dockercfg-7plhd-push\") pod \"sg-core-2-build\" (UID: \"bbf3660c-b068-476f-94c3-8a55f1ce1679\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:23:41 crc kubenswrapper[4906]: I0310 00:23:41.925230 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dck8j\" (UniqueName: \"kubernetes.io/projected/bbf3660c-b068-476f-94c3-8a55f1ce1679-kube-api-access-dck8j\") pod \"sg-core-2-build\" (UID: \"bbf3660c-b068-476f-94c3-8a55f1ce1679\") " pod="service-telemetry/sg-core-2-build" Mar 10 00:23:41 crc kubenswrapper[4906]: I0310 00:23:41.975739 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-2-build" Mar 10 00:23:42 crc kubenswrapper[4906]: I0310 00:23:42.436483 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-2-build"] Mar 10 00:23:42 crc kubenswrapper[4906]: I0310 00:23:42.549331 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"bbf3660c-b068-476f-94c3-8a55f1ce1679","Type":"ContainerStarted","Data":"d5169b29240101e3adb46b4bdfb045e26096cd4ecaba9d2f834d33b5a2dc9552"} Mar 10 00:23:42 crc kubenswrapper[4906]: I0310 00:23:42.590460 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c871b4bb-d234-4ceb-81d9-369ceb657836" path="/var/lib/kubelet/pods/c871b4bb-d234-4ceb-81d9-369ceb657836/volumes" Mar 10 00:23:43 crc kubenswrapper[4906]: I0310 00:23:43.558960 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"bbf3660c-b068-476f-94c3-8a55f1ce1679","Type":"ContainerStarted","Data":"81305789dae09ff7427ec646859288aaee1b0889b70bb0f4f0614b023c749125"} Mar 10 00:23:44 crc kubenswrapper[4906]: I0310 00:23:44.569033 4906 generic.go:334] "Generic (PLEG): container finished" podID="bbf3660c-b068-476f-94c3-8a55f1ce1679" containerID="81305789dae09ff7427ec646859288aaee1b0889b70bb0f4f0614b023c749125" exitCode=0 Mar 10 00:23:44 crc kubenswrapper[4906]: I0310 00:23:44.569149 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"bbf3660c-b068-476f-94c3-8a55f1ce1679","Type":"ContainerDied","Data":"81305789dae09ff7427ec646859288aaee1b0889b70bb0f4f0614b023c749125"} Mar 10 00:23:45 crc kubenswrapper[4906]: I0310 00:23:45.577347 4906 generic.go:334] "Generic (PLEG): container finished" podID="bbf3660c-b068-476f-94c3-8a55f1ce1679" containerID="2694aac51bdd36221d0b9b54620fa10bc929401b4275e1a82e79d533f08595e3" exitCode=0 Mar 10 00:23:45 crc kubenswrapper[4906]: I0310 00:23:45.577386 4906 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"bbf3660c-b068-476f-94c3-8a55f1ce1679","Type":"ContainerDied","Data":"2694aac51bdd36221d0b9b54620fa10bc929401b4275e1a82e79d533f08595e3"} Mar 10 00:23:45 crc kubenswrapper[4906]: I0310 00:23:45.628936 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-2-build_bbf3660c-b068-476f-94c3-8a55f1ce1679/manage-dockerfile/0.log" Mar 10 00:23:46 crc kubenswrapper[4906]: I0310 00:23:46.589422 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"bbf3660c-b068-476f-94c3-8a55f1ce1679","Type":"ContainerStarted","Data":"ca10f88d66a7cdd5deb1780d93b44680b72272cea37198270b38336f3187433a"} Mar 10 00:23:46 crc kubenswrapper[4906]: I0310 00:23:46.623151 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-core-2-build" podStartSLOduration=5.623132983 podStartE2EDuration="5.623132983s" podCreationTimestamp="2026-03-10 00:23:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:23:46.621069254 +0000 UTC m=+1052.768964366" watchObservedRunningTime="2026-03-10 00:23:46.623132983 +0000 UTC m=+1052.771028105" Mar 10 00:24:00 crc kubenswrapper[4906]: I0310 00:24:00.141814 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551704-cctxp"] Mar 10 00:24:00 crc kubenswrapper[4906]: I0310 00:24:00.143880 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551704-cctxp" Mar 10 00:24:00 crc kubenswrapper[4906]: I0310 00:24:00.146688 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 00:24:00 crc kubenswrapper[4906]: I0310 00:24:00.146839 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 00:24:00 crc kubenswrapper[4906]: I0310 00:24:00.147896 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ktdr6" Mar 10 00:24:00 crc kubenswrapper[4906]: I0310 00:24:00.150744 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqcfb\" (UniqueName: \"kubernetes.io/projected/4d53a3cb-cdd6-40c8-9fc8-7b101762bac5-kube-api-access-lqcfb\") pod \"auto-csr-approver-29551704-cctxp\" (UID: \"4d53a3cb-cdd6-40c8-9fc8-7b101762bac5\") " pod="openshift-infra/auto-csr-approver-29551704-cctxp" Mar 10 00:24:00 crc kubenswrapper[4906]: I0310 00:24:00.154437 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551704-cctxp"] Mar 10 00:24:00 crc kubenswrapper[4906]: I0310 00:24:00.251877 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqcfb\" (UniqueName: \"kubernetes.io/projected/4d53a3cb-cdd6-40c8-9fc8-7b101762bac5-kube-api-access-lqcfb\") pod \"auto-csr-approver-29551704-cctxp\" (UID: \"4d53a3cb-cdd6-40c8-9fc8-7b101762bac5\") " pod="openshift-infra/auto-csr-approver-29551704-cctxp" Mar 10 00:24:00 crc kubenswrapper[4906]: I0310 00:24:00.276450 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqcfb\" (UniqueName: \"kubernetes.io/projected/4d53a3cb-cdd6-40c8-9fc8-7b101762bac5-kube-api-access-lqcfb\") pod \"auto-csr-approver-29551704-cctxp\" (UID: \"4d53a3cb-cdd6-40c8-9fc8-7b101762bac5\") " 
pod="openshift-infra/auto-csr-approver-29551704-cctxp" Mar 10 00:24:00 crc kubenswrapper[4906]: I0310 00:24:00.467332 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551704-cctxp" Mar 10 00:24:00 crc kubenswrapper[4906]: I0310 00:24:00.957086 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551704-cctxp"] Mar 10 00:24:01 crc kubenswrapper[4906]: I0310 00:24:01.715997 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551704-cctxp" event={"ID":"4d53a3cb-cdd6-40c8-9fc8-7b101762bac5","Type":"ContainerStarted","Data":"c06c1c878cc8eec2557855126c37a8de6e019c3dde580c8ee5a8f33dd5ea3191"} Mar 10 00:24:02 crc kubenswrapper[4906]: I0310 00:24:02.724956 4906 generic.go:334] "Generic (PLEG): container finished" podID="4d53a3cb-cdd6-40c8-9fc8-7b101762bac5" containerID="41937946e05a86d965f7df64848df0deadba1152c147422834140e5f87cae877" exitCode=0 Mar 10 00:24:02 crc kubenswrapper[4906]: I0310 00:24:02.725732 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551704-cctxp" event={"ID":"4d53a3cb-cdd6-40c8-9fc8-7b101762bac5","Type":"ContainerDied","Data":"41937946e05a86d965f7df64848df0deadba1152c147422834140e5f87cae877"} Mar 10 00:24:04 crc kubenswrapper[4906]: I0310 00:24:04.083673 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551704-cctxp" Mar 10 00:24:04 crc kubenswrapper[4906]: I0310 00:24:04.219045 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqcfb\" (UniqueName: \"kubernetes.io/projected/4d53a3cb-cdd6-40c8-9fc8-7b101762bac5-kube-api-access-lqcfb\") pod \"4d53a3cb-cdd6-40c8-9fc8-7b101762bac5\" (UID: \"4d53a3cb-cdd6-40c8-9fc8-7b101762bac5\") " Mar 10 00:24:04 crc kubenswrapper[4906]: I0310 00:24:04.228745 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d53a3cb-cdd6-40c8-9fc8-7b101762bac5-kube-api-access-lqcfb" (OuterVolumeSpecName: "kube-api-access-lqcfb") pod "4d53a3cb-cdd6-40c8-9fc8-7b101762bac5" (UID: "4d53a3cb-cdd6-40c8-9fc8-7b101762bac5"). InnerVolumeSpecName "kube-api-access-lqcfb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:24:04 crc kubenswrapper[4906]: I0310 00:24:04.321141 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqcfb\" (UniqueName: \"kubernetes.io/projected/4d53a3cb-cdd6-40c8-9fc8-7b101762bac5-kube-api-access-lqcfb\") on node \"crc\" DevicePath \"\"" Mar 10 00:24:04 crc kubenswrapper[4906]: I0310 00:24:04.742693 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551704-cctxp" event={"ID":"4d53a3cb-cdd6-40c8-9fc8-7b101762bac5","Type":"ContainerDied","Data":"c06c1c878cc8eec2557855126c37a8de6e019c3dde580c8ee5a8f33dd5ea3191"} Mar 10 00:24:04 crc kubenswrapper[4906]: I0310 00:24:04.742736 4906 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c06c1c878cc8eec2557855126c37a8de6e019c3dde580c8ee5a8f33dd5ea3191" Mar 10 00:24:04 crc kubenswrapper[4906]: I0310 00:24:04.742783 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551704-cctxp" Mar 10 00:24:05 crc kubenswrapper[4906]: I0310 00:24:05.185847 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551698-z86s5"] Mar 10 00:24:05 crc kubenswrapper[4906]: I0310 00:24:05.199207 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551698-z86s5"] Mar 10 00:24:06 crc kubenswrapper[4906]: I0310 00:24:06.598776 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3f8803e-330e-4afb-af29-1c613251cd1c" path="/var/lib/kubelet/pods/b3f8803e-330e-4afb-af29-1c613251cd1c/volumes" Mar 10 00:24:21 crc kubenswrapper[4906]: I0310 00:24:21.860437 4906 scope.go:117] "RemoveContainer" containerID="32ed69827b44bfbe64167bcb25bdcb84d7ef40d25d70f4386d48e7176763354b" Mar 10 00:25:00 crc kubenswrapper[4906]: I0310 00:25:00.502348 4906 patch_prober.go:28] interesting pod/machine-config-daemon-bxtw4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:25:00 crc kubenswrapper[4906]: I0310 00:25:00.503021 4906 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:25:30 crc kubenswrapper[4906]: I0310 00:25:30.502842 4906 patch_prober.go:28] interesting pod/machine-config-daemon-bxtw4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:25:30 crc kubenswrapper[4906]: 
I0310 00:25:30.503313 4906 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:26:00 crc kubenswrapper[4906]: I0310 00:26:00.136597 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551706-pmx5k"] Mar 10 00:26:00 crc kubenswrapper[4906]: E0310 00:26:00.137357 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d53a3cb-cdd6-40c8-9fc8-7b101762bac5" containerName="oc" Mar 10 00:26:00 crc kubenswrapper[4906]: I0310 00:26:00.137371 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d53a3cb-cdd6-40c8-9fc8-7b101762bac5" containerName="oc" Mar 10 00:26:00 crc kubenswrapper[4906]: I0310 00:26:00.137513 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d53a3cb-cdd6-40c8-9fc8-7b101762bac5" containerName="oc" Mar 10 00:26:00 crc kubenswrapper[4906]: I0310 00:26:00.138015 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551706-pmx5k" Mar 10 00:26:00 crc kubenswrapper[4906]: I0310 00:26:00.141802 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 00:26:00 crc kubenswrapper[4906]: I0310 00:26:00.142062 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ktdr6" Mar 10 00:26:00 crc kubenswrapper[4906]: I0310 00:26:00.142196 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 00:26:00 crc kubenswrapper[4906]: I0310 00:26:00.144408 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551706-pmx5k"] Mar 10 00:26:00 crc kubenswrapper[4906]: I0310 00:26:00.249344 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r552p\" (UniqueName: \"kubernetes.io/projected/3a884931-579c-4774-83da-f61f0bf930c8-kube-api-access-r552p\") pod \"auto-csr-approver-29551706-pmx5k\" (UID: \"3a884931-579c-4774-83da-f61f0bf930c8\") " pod="openshift-infra/auto-csr-approver-29551706-pmx5k" Mar 10 00:26:00 crc kubenswrapper[4906]: I0310 00:26:00.350569 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r552p\" (UniqueName: \"kubernetes.io/projected/3a884931-579c-4774-83da-f61f0bf930c8-kube-api-access-r552p\") pod \"auto-csr-approver-29551706-pmx5k\" (UID: \"3a884931-579c-4774-83da-f61f0bf930c8\") " pod="openshift-infra/auto-csr-approver-29551706-pmx5k" Mar 10 00:26:00 crc kubenswrapper[4906]: I0310 00:26:00.368610 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r552p\" (UniqueName: \"kubernetes.io/projected/3a884931-579c-4774-83da-f61f0bf930c8-kube-api-access-r552p\") pod \"auto-csr-approver-29551706-pmx5k\" (UID: \"3a884931-579c-4774-83da-f61f0bf930c8\") " 
pod="openshift-infra/auto-csr-approver-29551706-pmx5k" Mar 10 00:26:00 crc kubenswrapper[4906]: I0310 00:26:00.458552 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551706-pmx5k" Mar 10 00:26:00 crc kubenswrapper[4906]: I0310 00:26:00.502930 4906 patch_prober.go:28] interesting pod/machine-config-daemon-bxtw4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:26:00 crc kubenswrapper[4906]: I0310 00:26:00.503159 4906 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:26:00 crc kubenswrapper[4906]: I0310 00:26:00.503294 4906 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" Mar 10 00:26:00 crc kubenswrapper[4906]: I0310 00:26:00.504553 4906 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ecbb89ce657e5a301b803c163bbd70823102707b301c704efb62181f5820ef4b"} pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 00:26:00 crc kubenswrapper[4906]: I0310 00:26:00.504754 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" containerName="machine-config-daemon" 
containerID="cri-o://ecbb89ce657e5a301b803c163bbd70823102707b301c704efb62181f5820ef4b" gracePeriod=600 Mar 10 00:26:00 crc kubenswrapper[4906]: I0310 00:26:00.853346 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551706-pmx5k"] Mar 10 00:26:00 crc kubenswrapper[4906]: W0310 00:26:00.860687 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a884931_579c_4774_83da_f61f0bf930c8.slice/crio-20449e2c6a85b8424e946696969dba66416e67dfe4e9910d120ee117e830dfe0 WatchSource:0}: Error finding container 20449e2c6a85b8424e946696969dba66416e67dfe4e9910d120ee117e830dfe0: Status 404 returned error can't find the container with id 20449e2c6a85b8424e946696969dba66416e67dfe4e9910d120ee117e830dfe0 Mar 10 00:26:01 crc kubenswrapper[4906]: I0310 00:26:01.178323 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551706-pmx5k" event={"ID":"3a884931-579c-4774-83da-f61f0bf930c8","Type":"ContainerStarted","Data":"20449e2c6a85b8424e946696969dba66416e67dfe4e9910d120ee117e830dfe0"} Mar 10 00:26:01 crc kubenswrapper[4906]: I0310 00:26:01.180702 4906 generic.go:334] "Generic (PLEG): container finished" podID="72d61d35-0a64-45a5-8df3-9c429727deba" containerID="ecbb89ce657e5a301b803c163bbd70823102707b301c704efb62181f5820ef4b" exitCode=0 Mar 10 00:26:01 crc kubenswrapper[4906]: I0310 00:26:01.180741 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" event={"ID":"72d61d35-0a64-45a5-8df3-9c429727deba","Type":"ContainerDied","Data":"ecbb89ce657e5a301b803c163bbd70823102707b301c704efb62181f5820ef4b"} Mar 10 00:26:01 crc kubenswrapper[4906]: I0310 00:26:01.180766 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" 
event={"ID":"72d61d35-0a64-45a5-8df3-9c429727deba","Type":"ContainerStarted","Data":"6847583fc3b7bdeec69f6786020e94f393a41e01c5039c9b2618c4b51a1b7db5"} Mar 10 00:26:01 crc kubenswrapper[4906]: I0310 00:26:01.180780 4906 scope.go:117] "RemoveContainer" containerID="90546ce32412324c82386f9bf379f32c5b7b1738d24c8c80aab36b837d2ad814" Mar 10 00:26:03 crc kubenswrapper[4906]: I0310 00:26:03.196932 4906 generic.go:334] "Generic (PLEG): container finished" podID="3a884931-579c-4774-83da-f61f0bf930c8" containerID="c4564953997dd352418213cb088ceb3f283d94aae07ff89e5ce9e44357d4b6d6" exitCode=0 Mar 10 00:26:03 crc kubenswrapper[4906]: I0310 00:26:03.197034 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551706-pmx5k" event={"ID":"3a884931-579c-4774-83da-f61f0bf930c8","Type":"ContainerDied","Data":"c4564953997dd352418213cb088ceb3f283d94aae07ff89e5ce9e44357d4b6d6"} Mar 10 00:26:04 crc kubenswrapper[4906]: I0310 00:26:04.474414 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551706-pmx5k" Mar 10 00:26:04 crc kubenswrapper[4906]: I0310 00:26:04.505519 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r552p\" (UniqueName: \"kubernetes.io/projected/3a884931-579c-4774-83da-f61f0bf930c8-kube-api-access-r552p\") pod \"3a884931-579c-4774-83da-f61f0bf930c8\" (UID: \"3a884931-579c-4774-83da-f61f0bf930c8\") " Mar 10 00:26:04 crc kubenswrapper[4906]: I0310 00:26:04.532907 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a884931-579c-4774-83da-f61f0bf930c8-kube-api-access-r552p" (OuterVolumeSpecName: "kube-api-access-r552p") pod "3a884931-579c-4774-83da-f61f0bf930c8" (UID: "3a884931-579c-4774-83da-f61f0bf930c8"). InnerVolumeSpecName "kube-api-access-r552p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:26:04 crc kubenswrapper[4906]: I0310 00:26:04.606834 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r552p\" (UniqueName: \"kubernetes.io/projected/3a884931-579c-4774-83da-f61f0bf930c8-kube-api-access-r552p\") on node \"crc\" DevicePath \"\"" Mar 10 00:26:05 crc kubenswrapper[4906]: I0310 00:26:05.213115 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551706-pmx5k" event={"ID":"3a884931-579c-4774-83da-f61f0bf930c8","Type":"ContainerDied","Data":"20449e2c6a85b8424e946696969dba66416e67dfe4e9910d120ee117e830dfe0"} Mar 10 00:26:05 crc kubenswrapper[4906]: I0310 00:26:05.213159 4906 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20449e2c6a85b8424e946696969dba66416e67dfe4e9910d120ee117e830dfe0" Mar 10 00:26:05 crc kubenswrapper[4906]: I0310 00:26:05.213213 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551706-pmx5k" Mar 10 00:26:05 crc kubenswrapper[4906]: I0310 00:26:05.556100 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551700-8txcl"] Mar 10 00:26:05 crc kubenswrapper[4906]: I0310 00:26:05.565507 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551700-8txcl"] Mar 10 00:26:06 crc kubenswrapper[4906]: I0310 00:26:06.589909 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4f0a933-9b21-4d0b-8f8f-29e4c93f0336" path="/var/lib/kubelet/pods/f4f0a933-9b21-4d0b-8f8f-29e4c93f0336/volumes" Mar 10 00:26:21 crc kubenswrapper[4906]: I0310 00:26:21.945800 4906 scope.go:117] "RemoveContainer" containerID="75a02f0a09d6cfec5c5dfae1f7359fc71b1fb3157d79395804cd5e88e7b94a5d" Mar 10 00:26:25 crc kubenswrapper[4906]: I0310 00:26:25.677417 4906 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc 
container/kube-scheduler namespace/openshift-kube-scheduler: Liveness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": context deadline exceeded" start-of-body= Mar 10 00:26:25 crc kubenswrapper[4906]: I0310 00:26:25.677468 4906 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": context deadline exceeded" Mar 10 00:27:21 crc kubenswrapper[4906]: I0310 00:27:21.066275 4906 generic.go:334] "Generic (PLEG): container finished" podID="bbf3660c-b068-476f-94c3-8a55f1ce1679" containerID="ca10f88d66a7cdd5deb1780d93b44680b72272cea37198270b38336f3187433a" exitCode=0 Mar 10 00:27:21 crc kubenswrapper[4906]: I0310 00:27:21.066402 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"bbf3660c-b068-476f-94c3-8a55f1ce1679","Type":"ContainerDied","Data":"ca10f88d66a7cdd5deb1780d93b44680b72272cea37198270b38336f3187433a"} Mar 10 00:27:22 crc kubenswrapper[4906]: I0310 00:27:22.420676 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-2-build" Mar 10 00:27:22 crc kubenswrapper[4906]: I0310 00:27:22.505378 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bbf3660c-b068-476f-94c3-8a55f1ce1679-node-pullsecrets\") pod \"bbf3660c-b068-476f-94c3-8a55f1ce1679\" (UID: \"bbf3660c-b068-476f-94c3-8a55f1ce1679\") " Mar 10 00:27:22 crc kubenswrapper[4906]: I0310 00:27:22.505436 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/bbf3660c-b068-476f-94c3-8a55f1ce1679-build-blob-cache\") pod \"bbf3660c-b068-476f-94c3-8a55f1ce1679\" (UID: \"bbf3660c-b068-476f-94c3-8a55f1ce1679\") " Mar 10 00:27:22 crc kubenswrapper[4906]: I0310 00:27:22.505469 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/bbf3660c-b068-476f-94c3-8a55f1ce1679-builder-dockercfg-7plhd-pull\") pod \"bbf3660c-b068-476f-94c3-8a55f1ce1679\" (UID: \"bbf3660c-b068-476f-94c3-8a55f1ce1679\") " Mar 10 00:27:22 crc kubenswrapper[4906]: I0310 00:27:22.505500 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/bbf3660c-b068-476f-94c3-8a55f1ce1679-buildcachedir\") pod \"bbf3660c-b068-476f-94c3-8a55f1ce1679\" (UID: \"bbf3660c-b068-476f-94c3-8a55f1ce1679\") " Mar 10 00:27:22 crc kubenswrapper[4906]: I0310 00:27:22.505527 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/bbf3660c-b068-476f-94c3-8a55f1ce1679-container-storage-root\") pod \"bbf3660c-b068-476f-94c3-8a55f1ce1679\" (UID: \"bbf3660c-b068-476f-94c3-8a55f1ce1679\") " Mar 10 00:27:22 crc kubenswrapper[4906]: I0310 00:27:22.505553 4906 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bbf3660c-b068-476f-94c3-8a55f1ce1679-build-ca-bundles\") pod \"bbf3660c-b068-476f-94c3-8a55f1ce1679\" (UID: \"bbf3660c-b068-476f-94c3-8a55f1ce1679\") " Mar 10 00:27:22 crc kubenswrapper[4906]: I0310 00:27:22.505614 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/bbf3660c-b068-476f-94c3-8a55f1ce1679-buildworkdir\") pod \"bbf3660c-b068-476f-94c3-8a55f1ce1679\" (UID: \"bbf3660c-b068-476f-94c3-8a55f1ce1679\") " Mar 10 00:27:22 crc kubenswrapper[4906]: I0310 00:27:22.505674 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/bbf3660c-b068-476f-94c3-8a55f1ce1679-container-storage-run\") pod \"bbf3660c-b068-476f-94c3-8a55f1ce1679\" (UID: \"bbf3660c-b068-476f-94c3-8a55f1ce1679\") " Mar 10 00:27:22 crc kubenswrapper[4906]: I0310 00:27:22.505713 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/bbf3660c-b068-476f-94c3-8a55f1ce1679-builder-dockercfg-7plhd-push\") pod \"bbf3660c-b068-476f-94c3-8a55f1ce1679\" (UID: \"bbf3660c-b068-476f-94c3-8a55f1ce1679\") " Mar 10 00:27:22 crc kubenswrapper[4906]: I0310 00:27:22.505705 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bbf3660c-b068-476f-94c3-8a55f1ce1679-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "bbf3660c-b068-476f-94c3-8a55f1ce1679" (UID: "bbf3660c-b068-476f-94c3-8a55f1ce1679"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:27:22 crc kubenswrapper[4906]: I0310 00:27:22.505768 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/bbf3660c-b068-476f-94c3-8a55f1ce1679-build-system-configs\") pod \"bbf3660c-b068-476f-94c3-8a55f1ce1679\" (UID: \"bbf3660c-b068-476f-94c3-8a55f1ce1679\") " Mar 10 00:27:22 crc kubenswrapper[4906]: I0310 00:27:22.505826 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bbf3660c-b068-476f-94c3-8a55f1ce1679-build-proxy-ca-bundles\") pod \"bbf3660c-b068-476f-94c3-8a55f1ce1679\" (UID: \"bbf3660c-b068-476f-94c3-8a55f1ce1679\") " Mar 10 00:27:22 crc kubenswrapper[4906]: I0310 00:27:22.505851 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dck8j\" (UniqueName: \"kubernetes.io/projected/bbf3660c-b068-476f-94c3-8a55f1ce1679-kube-api-access-dck8j\") pod \"bbf3660c-b068-476f-94c3-8a55f1ce1679\" (UID: \"bbf3660c-b068-476f-94c3-8a55f1ce1679\") " Mar 10 00:27:22 crc kubenswrapper[4906]: I0310 00:27:22.506104 4906 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/bbf3660c-b068-476f-94c3-8a55f1ce1679-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 10 00:27:22 crc kubenswrapper[4906]: I0310 00:27:22.506736 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bbf3660c-b068-476f-94c3-8a55f1ce1679-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "bbf3660c-b068-476f-94c3-8a55f1ce1679" (UID: "bbf3660c-b068-476f-94c3-8a55f1ce1679"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:27:22 crc kubenswrapper[4906]: I0310 00:27:22.507168 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbf3660c-b068-476f-94c3-8a55f1ce1679-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "bbf3660c-b068-476f-94c3-8a55f1ce1679" (UID: "bbf3660c-b068-476f-94c3-8a55f1ce1679"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:27:22 crc kubenswrapper[4906]: I0310 00:27:22.507387 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbf3660c-b068-476f-94c3-8a55f1ce1679-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "bbf3660c-b068-476f-94c3-8a55f1ce1679" (UID: "bbf3660c-b068-476f-94c3-8a55f1ce1679"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:27:22 crc kubenswrapper[4906]: I0310 00:27:22.507452 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbf3660c-b068-476f-94c3-8a55f1ce1679-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "bbf3660c-b068-476f-94c3-8a55f1ce1679" (UID: "bbf3660c-b068-476f-94c3-8a55f1ce1679"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:27:22 crc kubenswrapper[4906]: I0310 00:27:22.508326 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbf3660c-b068-476f-94c3-8a55f1ce1679-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "bbf3660c-b068-476f-94c3-8a55f1ce1679" (UID: "bbf3660c-b068-476f-94c3-8a55f1ce1679"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:27:22 crc kubenswrapper[4906]: I0310 00:27:22.515844 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbf3660c-b068-476f-94c3-8a55f1ce1679-builder-dockercfg-7plhd-pull" (OuterVolumeSpecName: "builder-dockercfg-7plhd-pull") pod "bbf3660c-b068-476f-94c3-8a55f1ce1679" (UID: "bbf3660c-b068-476f-94c3-8a55f1ce1679"). InnerVolumeSpecName "builder-dockercfg-7plhd-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:27:22 crc kubenswrapper[4906]: I0310 00:27:22.515889 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbf3660c-b068-476f-94c3-8a55f1ce1679-kube-api-access-dck8j" (OuterVolumeSpecName: "kube-api-access-dck8j") pod "bbf3660c-b068-476f-94c3-8a55f1ce1679" (UID: "bbf3660c-b068-476f-94c3-8a55f1ce1679"). InnerVolumeSpecName "kube-api-access-dck8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:27:22 crc kubenswrapper[4906]: I0310 00:27:22.516101 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbf3660c-b068-476f-94c3-8a55f1ce1679-builder-dockercfg-7plhd-push" (OuterVolumeSpecName: "builder-dockercfg-7plhd-push") pod "bbf3660c-b068-476f-94c3-8a55f1ce1679" (UID: "bbf3660c-b068-476f-94c3-8a55f1ce1679"). InnerVolumeSpecName "builder-dockercfg-7plhd-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:27:22 crc kubenswrapper[4906]: I0310 00:27:22.526002 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbf3660c-b068-476f-94c3-8a55f1ce1679-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "bbf3660c-b068-476f-94c3-8a55f1ce1679" (UID: "bbf3660c-b068-476f-94c3-8a55f1ce1679"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:27:22 crc kubenswrapper[4906]: I0310 00:27:22.607416 4906 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bbf3660c-b068-476f-94c3-8a55f1ce1679-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:27:22 crc kubenswrapper[4906]: I0310 00:27:22.607452 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dck8j\" (UniqueName: \"kubernetes.io/projected/bbf3660c-b068-476f-94c3-8a55f1ce1679-kube-api-access-dck8j\") on node \"crc\" DevicePath \"\"" Mar 10 00:27:22 crc kubenswrapper[4906]: I0310 00:27:22.607466 4906 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bbf3660c-b068-476f-94c3-8a55f1ce1679-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 10 00:27:22 crc kubenswrapper[4906]: I0310 00:27:22.607480 4906 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/bbf3660c-b068-476f-94c3-8a55f1ce1679-builder-dockercfg-7plhd-pull\") on node \"crc\" DevicePath \"\"" Mar 10 00:27:22 crc kubenswrapper[4906]: I0310 00:27:22.607492 4906 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bbf3660c-b068-476f-94c3-8a55f1ce1679-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:27:22 crc kubenswrapper[4906]: I0310 00:27:22.607505 4906 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/bbf3660c-b068-476f-94c3-8a55f1ce1679-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 10 00:27:22 crc kubenswrapper[4906]: I0310 00:27:22.607517 4906 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/bbf3660c-b068-476f-94c3-8a55f1ce1679-container-storage-run\") on node \"crc\" DevicePath 
\"\"" Mar 10 00:27:22 crc kubenswrapper[4906]: I0310 00:27:22.607528 4906 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/bbf3660c-b068-476f-94c3-8a55f1ce1679-builder-dockercfg-7plhd-push\") on node \"crc\" DevicePath \"\"" Mar 10 00:27:22 crc kubenswrapper[4906]: I0310 00:27:22.607539 4906 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/bbf3660c-b068-476f-94c3-8a55f1ce1679-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 10 00:27:23 crc kubenswrapper[4906]: I0310 00:27:23.081654 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"bbf3660c-b068-476f-94c3-8a55f1ce1679","Type":"ContainerDied","Data":"d5169b29240101e3adb46b4bdfb045e26096cd4ecaba9d2f834d33b5a2dc9552"} Mar 10 00:27:23 crc kubenswrapper[4906]: I0310 00:27:23.081690 4906 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5169b29240101e3adb46b4bdfb045e26096cd4ecaba9d2f834d33b5a2dc9552" Mar 10 00:27:23 crc kubenswrapper[4906]: I0310 00:27:23.081770 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-2-build" Mar 10 00:27:23 crc kubenswrapper[4906]: I0310 00:27:23.402203 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbf3660c-b068-476f-94c3-8a55f1ce1679-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "bbf3660c-b068-476f-94c3-8a55f1ce1679" (UID: "bbf3660c-b068-476f-94c3-8a55f1ce1679"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:27:23 crc kubenswrapper[4906]: I0310 00:27:23.420056 4906 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/bbf3660c-b068-476f-94c3-8a55f1ce1679-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 10 00:27:25 crc kubenswrapper[4906]: I0310 00:27:25.843791 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbf3660c-b068-476f-94c3-8a55f1ce1679-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "bbf3660c-b068-476f-94c3-8a55f1ce1679" (UID: "bbf3660c-b068-476f-94c3-8a55f1ce1679"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:27:25 crc kubenswrapper[4906]: I0310 00:27:25.858661 4906 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/bbf3660c-b068-476f-94c3-8a55f1ce1679-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 10 00:27:26 crc kubenswrapper[4906]: I0310 00:27:26.847010 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 10 00:27:26 crc kubenswrapper[4906]: E0310 00:27:26.847487 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbf3660c-b068-476f-94c3-8a55f1ce1679" containerName="docker-build" Mar 10 00:27:26 crc kubenswrapper[4906]: I0310 00:27:26.847516 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbf3660c-b068-476f-94c3-8a55f1ce1679" containerName="docker-build" Mar 10 00:27:26 crc kubenswrapper[4906]: E0310 00:27:26.847541 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbf3660c-b068-476f-94c3-8a55f1ce1679" containerName="manage-dockerfile" Mar 10 00:27:26 crc kubenswrapper[4906]: I0310 00:27:26.847555 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbf3660c-b068-476f-94c3-8a55f1ce1679" 
containerName="manage-dockerfile" Mar 10 00:27:26 crc kubenswrapper[4906]: E0310 00:27:26.847585 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbf3660c-b068-476f-94c3-8a55f1ce1679" containerName="git-clone" Mar 10 00:27:26 crc kubenswrapper[4906]: I0310 00:27:26.847602 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbf3660c-b068-476f-94c3-8a55f1ce1679" containerName="git-clone" Mar 10 00:27:26 crc kubenswrapper[4906]: E0310 00:27:26.847630 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a884931-579c-4774-83da-f61f0bf930c8" containerName="oc" Mar 10 00:27:26 crc kubenswrapper[4906]: I0310 00:27:26.847685 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a884931-579c-4774-83da-f61f0bf930c8" containerName="oc" Mar 10 00:27:26 crc kubenswrapper[4906]: I0310 00:27:26.847919 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbf3660c-b068-476f-94c3-8a55f1ce1679" containerName="docker-build" Mar 10 00:27:26 crc kubenswrapper[4906]: I0310 00:27:26.847954 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a884931-579c-4774-83da-f61f0bf930c8" containerName="oc" Mar 10 00:27:26 crc kubenswrapper[4906]: I0310 00:27:26.849329 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Mar 10 00:27:26 crc kubenswrapper[4906]: I0310 00:27:26.851461 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-1-global-ca" Mar 10 00:27:26 crc kubenswrapper[4906]: I0310 00:27:26.851565 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-7plhd" Mar 10 00:27:26 crc kubenswrapper[4906]: I0310 00:27:26.852001 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-1-ca" Mar 10 00:27:26 crc kubenswrapper[4906]: I0310 00:27:26.852459 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-1-sys-config" Mar 10 00:27:26 crc kubenswrapper[4906]: I0310 00:27:26.872176 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 10 00:27:26 crc kubenswrapper[4906]: I0310 00:27:26.974145 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/0bd90c3d-cb4d-4834-af5e-0342eb26d553-builder-dockercfg-7plhd-pull\") pod \"sg-bridge-1-build\" (UID: \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:27:26 crc kubenswrapper[4906]: I0310 00:27:26.974212 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0bd90c3d-cb4d-4834-af5e-0342eb26d553-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:27:26 crc kubenswrapper[4906]: I0310 00:27:26.974283 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: 
\"kubernetes.io/empty-dir/0bd90c3d-cb4d-4834-af5e-0342eb26d553-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:27:26 crc kubenswrapper[4906]: I0310 00:27:26.974364 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/0bd90c3d-cb4d-4834-af5e-0342eb26d553-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:27:26 crc kubenswrapper[4906]: I0310 00:27:26.974516 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/0bd90c3d-cb4d-4834-af5e-0342eb26d553-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:27:26 crc kubenswrapper[4906]: I0310 00:27:26.974621 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0bd90c3d-cb4d-4834-af5e-0342eb26d553-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:27:26 crc kubenswrapper[4906]: I0310 00:27:26.974788 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/0bd90c3d-cb4d-4834-af5e-0342eb26d553-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:27:26 crc kubenswrapper[4906]: I0310 00:27:26.974889 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: 
\"kubernetes.io/empty-dir/0bd90c3d-cb4d-4834-af5e-0342eb26d553-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:27:26 crc kubenswrapper[4906]: I0310 00:27:26.975043 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/0bd90c3d-cb4d-4834-af5e-0342eb26d553-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:27:26 crc kubenswrapper[4906]: I0310 00:27:26.975128 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xmxd\" (UniqueName: \"kubernetes.io/projected/0bd90c3d-cb4d-4834-af5e-0342eb26d553-kube-api-access-5xmxd\") pod \"sg-bridge-1-build\" (UID: \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:27:26 crc kubenswrapper[4906]: I0310 00:27:26.975164 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0bd90c3d-cb4d-4834-af5e-0342eb26d553-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:27:26 crc kubenswrapper[4906]: I0310 00:27:26.975194 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/0bd90c3d-cb4d-4834-af5e-0342eb26d553-builder-dockercfg-7plhd-push\") pod \"sg-bridge-1-build\" (UID: \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:27:27 crc kubenswrapper[4906]: I0310 00:27:27.078922 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" 
(UniqueName: \"kubernetes.io/empty-dir/0bd90c3d-cb4d-4834-af5e-0342eb26d553-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:27:27 crc kubenswrapper[4906]: I0310 00:27:27.078993 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0bd90c3d-cb4d-4834-af5e-0342eb26d553-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:27:27 crc kubenswrapper[4906]: I0310 00:27:27.079038 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/0bd90c3d-cb4d-4834-af5e-0342eb26d553-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:27:27 crc kubenswrapper[4906]: I0310 00:27:27.079190 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/0bd90c3d-cb4d-4834-af5e-0342eb26d553-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:27:27 crc kubenswrapper[4906]: I0310 00:27:27.079262 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0bd90c3d-cb4d-4834-af5e-0342eb26d553-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:27:27 crc kubenswrapper[4906]: I0310 00:27:27.079313 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/0bd90c3d-cb4d-4834-af5e-0342eb26d553-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:27:27 crc kubenswrapper[4906]: I0310 00:27:27.079368 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/0bd90c3d-cb4d-4834-af5e-0342eb26d553-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:27:27 crc kubenswrapper[4906]: I0310 00:27:27.079406 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/0bd90c3d-cb4d-4834-af5e-0342eb26d553-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:27:27 crc kubenswrapper[4906]: I0310 00:27:27.079445 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xmxd\" (UniqueName: \"kubernetes.io/projected/0bd90c3d-cb4d-4834-af5e-0342eb26d553-kube-api-access-5xmxd\") pod \"sg-bridge-1-build\" (UID: \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:27:27 crc kubenswrapper[4906]: I0310 00:27:27.079488 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0bd90c3d-cb4d-4834-af5e-0342eb26d553-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:27:27 crc kubenswrapper[4906]: I0310 00:27:27.079519 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-7plhd-push\" (UniqueName: 
\"kubernetes.io/secret/0bd90c3d-cb4d-4834-af5e-0342eb26d553-builder-dockercfg-7plhd-push\") pod \"sg-bridge-1-build\" (UID: \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:27:27 crc kubenswrapper[4906]: I0310 00:27:27.079611 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/0bd90c3d-cb4d-4834-af5e-0342eb26d553-builder-dockercfg-7plhd-pull\") pod \"sg-bridge-1-build\" (UID: \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:27:27 crc kubenswrapper[4906]: I0310 00:27:27.086102 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/0bd90c3d-cb4d-4834-af5e-0342eb26d553-builder-dockercfg-7plhd-pull\") pod \"sg-bridge-1-build\" (UID: \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:27:27 crc kubenswrapper[4906]: I0310 00:27:27.086580 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/0bd90c3d-cb4d-4834-af5e-0342eb26d553-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:27:27 crc kubenswrapper[4906]: I0310 00:27:27.087481 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0bd90c3d-cb4d-4834-af5e-0342eb26d553-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:27:27 crc kubenswrapper[4906]: I0310 00:27:27.087696 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/0bd90c3d-cb4d-4834-af5e-0342eb26d553-buildcachedir\") pod 
\"sg-bridge-1-build\" (UID: \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:27:27 crc kubenswrapper[4906]: I0310 00:27:27.088042 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/0bd90c3d-cb4d-4834-af5e-0342eb26d553-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:27:27 crc kubenswrapper[4906]: I0310 00:27:27.089043 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0bd90c3d-cb4d-4834-af5e-0342eb26d553-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:27:27 crc kubenswrapper[4906]: I0310 00:27:27.089413 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/0bd90c3d-cb4d-4834-af5e-0342eb26d553-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:27:27 crc kubenswrapper[4906]: I0310 00:27:27.089578 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0bd90c3d-cb4d-4834-af5e-0342eb26d553-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:27:27 crc kubenswrapper[4906]: I0310 00:27:27.090009 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/0bd90c3d-cb4d-4834-af5e-0342eb26d553-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 
00:27:27 crc kubenswrapper[4906]: I0310 00:27:27.090517 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/0bd90c3d-cb4d-4834-af5e-0342eb26d553-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:27:27 crc kubenswrapper[4906]: I0310 00:27:27.101353 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/0bd90c3d-cb4d-4834-af5e-0342eb26d553-builder-dockercfg-7plhd-push\") pod \"sg-bridge-1-build\" (UID: \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:27:27 crc kubenswrapper[4906]: I0310 00:27:27.117180 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xmxd\" (UniqueName: \"kubernetes.io/projected/0bd90c3d-cb4d-4834-af5e-0342eb26d553-kube-api-access-5xmxd\") pod \"sg-bridge-1-build\" (UID: \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\") " pod="service-telemetry/sg-bridge-1-build" Mar 10 00:27:27 crc kubenswrapper[4906]: I0310 00:27:27.166159 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Mar 10 00:27:27 crc kubenswrapper[4906]: I0310 00:27:27.386121 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 10 00:27:28 crc kubenswrapper[4906]: I0310 00:27:28.128479 4906 generic.go:334] "Generic (PLEG): container finished" podID="0bd90c3d-cb4d-4834-af5e-0342eb26d553" containerID="92b171299d2ef5e48eeccb6789efb415f5892340d4770ec912aef8d2844b4fa4" exitCode=0 Mar 10 00:27:28 crc kubenswrapper[4906]: I0310 00:27:28.128585 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"0bd90c3d-cb4d-4834-af5e-0342eb26d553","Type":"ContainerDied","Data":"92b171299d2ef5e48eeccb6789efb415f5892340d4770ec912aef8d2844b4fa4"} Mar 10 00:27:28 crc kubenswrapper[4906]: I0310 00:27:28.128878 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"0bd90c3d-cb4d-4834-af5e-0342eb26d553","Type":"ContainerStarted","Data":"aa795a0586ecfa332714b0eb0a6ec1d09812d4d9db63e0ed79db718ff6202b23"} Mar 10 00:27:29 crc kubenswrapper[4906]: I0310 00:27:29.144216 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"0bd90c3d-cb4d-4834-af5e-0342eb26d553","Type":"ContainerStarted","Data":"7f7e4ecad566630fe6adcd2e8388d087436c672f8989c4cb49b3fe3637eafc12"} Mar 10 00:27:29 crc kubenswrapper[4906]: I0310 00:27:29.186481 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-bridge-1-build" podStartSLOduration=3.186450842 podStartE2EDuration="3.186450842s" podCreationTimestamp="2026-03-10 00:27:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:27:29.177022408 +0000 UTC m=+1275.324917560" watchObservedRunningTime="2026-03-10 00:27:29.186450842 +0000 UTC m=+1275.334345984" Mar 10 00:27:36 
crc kubenswrapper[4906]: I0310 00:27:36.217128 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-1-build_0bd90c3d-cb4d-4834-af5e-0342eb26d553/docker-build/0.log" Mar 10 00:27:36 crc kubenswrapper[4906]: I0310 00:27:36.218021 4906 generic.go:334] "Generic (PLEG): container finished" podID="0bd90c3d-cb4d-4834-af5e-0342eb26d553" containerID="7f7e4ecad566630fe6adcd2e8388d087436c672f8989c4cb49b3fe3637eafc12" exitCode=1 Mar 10 00:27:36 crc kubenswrapper[4906]: I0310 00:27:36.218069 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"0bd90c3d-cb4d-4834-af5e-0342eb26d553","Type":"ContainerDied","Data":"7f7e4ecad566630fe6adcd2e8388d087436c672f8989c4cb49b3fe3637eafc12"} Mar 10 00:27:37 crc kubenswrapper[4906]: I0310 00:27:37.161891 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 10 00:27:37 crc kubenswrapper[4906]: I0310 00:27:37.468786 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-1-build_0bd90c3d-cb4d-4834-af5e-0342eb26d553/docker-build/0.log" Mar 10 00:27:37 crc kubenswrapper[4906]: I0310 00:27:37.469402 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Mar 10 00:27:37 crc kubenswrapper[4906]: I0310 00:27:37.629442 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/0bd90c3d-cb4d-4834-af5e-0342eb26d553-builder-dockercfg-7plhd-push\") pod \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\" (UID: \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\") " Mar 10 00:27:37 crc kubenswrapper[4906]: I0310 00:27:37.629515 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0bd90c3d-cb4d-4834-af5e-0342eb26d553-build-ca-bundles\") pod \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\" (UID: \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\") " Mar 10 00:27:37 crc kubenswrapper[4906]: I0310 00:27:37.629704 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0bd90c3d-cb4d-4834-af5e-0342eb26d553-node-pullsecrets\") pod \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\" (UID: \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\") " Mar 10 00:27:37 crc kubenswrapper[4906]: I0310 00:27:37.629758 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/0bd90c3d-cb4d-4834-af5e-0342eb26d553-build-blob-cache\") pod \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\" (UID: \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\") " Mar 10 00:27:37 crc kubenswrapper[4906]: I0310 00:27:37.629836 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/0bd90c3d-cb4d-4834-af5e-0342eb26d553-buildcachedir\") pod \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\" (UID: \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\") " Mar 10 00:27:37 crc kubenswrapper[4906]: I0310 00:27:37.629888 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/0bd90c3d-cb4d-4834-af5e-0342eb26d553-builder-dockercfg-7plhd-pull\") pod \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\" (UID: \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\") " Mar 10 00:27:37 crc kubenswrapper[4906]: I0310 00:27:37.629886 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0bd90c3d-cb4d-4834-af5e-0342eb26d553-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "0bd90c3d-cb4d-4834-af5e-0342eb26d553" (UID: "0bd90c3d-cb4d-4834-af5e-0342eb26d553"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:27:37 crc kubenswrapper[4906]: I0310 00:27:37.629928 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/0bd90c3d-cb4d-4834-af5e-0342eb26d553-container-storage-run\") pod \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\" (UID: \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\") " Mar 10 00:27:37 crc kubenswrapper[4906]: I0310 00:27:37.630025 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0bd90c3d-cb4d-4834-af5e-0342eb26d553-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "0bd90c3d-cb4d-4834-af5e-0342eb26d553" (UID: "0bd90c3d-cb4d-4834-af5e-0342eb26d553"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:27:37 crc kubenswrapper[4906]: I0310 00:27:37.630065 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/0bd90c3d-cb4d-4834-af5e-0342eb26d553-container-storage-root\") pod \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\" (UID: \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\") " Mar 10 00:27:37 crc kubenswrapper[4906]: I0310 00:27:37.630154 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xmxd\" (UniqueName: \"kubernetes.io/projected/0bd90c3d-cb4d-4834-af5e-0342eb26d553-kube-api-access-5xmxd\") pod \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\" (UID: \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\") " Mar 10 00:27:37 crc kubenswrapper[4906]: I0310 00:27:37.630209 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/0bd90c3d-cb4d-4834-af5e-0342eb26d553-buildworkdir\") pod \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\" (UID: \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\") " Mar 10 00:27:37 crc kubenswrapper[4906]: I0310 00:27:37.630248 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0bd90c3d-cb4d-4834-af5e-0342eb26d553-build-proxy-ca-bundles\") pod \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\" (UID: \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\") " Mar 10 00:27:37 crc kubenswrapper[4906]: I0310 00:27:37.630291 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/0bd90c3d-cb4d-4834-af5e-0342eb26d553-build-system-configs\") pod \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\" (UID: \"0bd90c3d-cb4d-4834-af5e-0342eb26d553\") " Mar 10 00:27:37 crc kubenswrapper[4906]: I0310 00:27:37.630841 4906 reconciler_common.go:293] "Volume detached for 
volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0bd90c3d-cb4d-4834-af5e-0342eb26d553-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 10 00:27:37 crc kubenswrapper[4906]: I0310 00:27:37.630858 4906 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/0bd90c3d-cb4d-4834-af5e-0342eb26d553-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 10 00:27:37 crc kubenswrapper[4906]: I0310 00:27:37.631179 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bd90c3d-cb4d-4834-af5e-0342eb26d553-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "0bd90c3d-cb4d-4834-af5e-0342eb26d553" (UID: "0bd90c3d-cb4d-4834-af5e-0342eb26d553"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:27:37 crc kubenswrapper[4906]: I0310 00:27:37.631289 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bd90c3d-cb4d-4834-af5e-0342eb26d553-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "0bd90c3d-cb4d-4834-af5e-0342eb26d553" (UID: "0bd90c3d-cb4d-4834-af5e-0342eb26d553"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:27:37 crc kubenswrapper[4906]: I0310 00:27:37.631414 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bd90c3d-cb4d-4834-af5e-0342eb26d553-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "0bd90c3d-cb4d-4834-af5e-0342eb26d553" (UID: "0bd90c3d-cb4d-4834-af5e-0342eb26d553"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:27:37 crc kubenswrapper[4906]: I0310 00:27:37.631617 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bd90c3d-cb4d-4834-af5e-0342eb26d553-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "0bd90c3d-cb4d-4834-af5e-0342eb26d553" (UID: "0bd90c3d-cb4d-4834-af5e-0342eb26d553"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:27:37 crc kubenswrapper[4906]: I0310 00:27:37.632004 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bd90c3d-cb4d-4834-af5e-0342eb26d553-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "0bd90c3d-cb4d-4834-af5e-0342eb26d553" (UID: "0bd90c3d-cb4d-4834-af5e-0342eb26d553"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:27:37 crc kubenswrapper[4906]: I0310 00:27:37.636417 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bd90c3d-cb4d-4834-af5e-0342eb26d553-builder-dockercfg-7plhd-pull" (OuterVolumeSpecName: "builder-dockercfg-7plhd-pull") pod "0bd90c3d-cb4d-4834-af5e-0342eb26d553" (UID: "0bd90c3d-cb4d-4834-af5e-0342eb26d553"). InnerVolumeSpecName "builder-dockercfg-7plhd-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:27:37 crc kubenswrapper[4906]: I0310 00:27:37.637329 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bd90c3d-cb4d-4834-af5e-0342eb26d553-builder-dockercfg-7plhd-push" (OuterVolumeSpecName: "builder-dockercfg-7plhd-push") pod "0bd90c3d-cb4d-4834-af5e-0342eb26d553" (UID: "0bd90c3d-cb4d-4834-af5e-0342eb26d553"). InnerVolumeSpecName "builder-dockercfg-7plhd-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:27:37 crc kubenswrapper[4906]: I0310 00:27:37.638913 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bd90c3d-cb4d-4834-af5e-0342eb26d553-kube-api-access-5xmxd" (OuterVolumeSpecName: "kube-api-access-5xmxd") pod "0bd90c3d-cb4d-4834-af5e-0342eb26d553" (UID: "0bd90c3d-cb4d-4834-af5e-0342eb26d553"). InnerVolumeSpecName "kube-api-access-5xmxd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:27:37 crc kubenswrapper[4906]: I0310 00:27:37.732925 4906 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/0bd90c3d-cb4d-4834-af5e-0342eb26d553-builder-dockercfg-7plhd-pull\") on node \"crc\" DevicePath \"\"" Mar 10 00:27:37 crc kubenswrapper[4906]: I0310 00:27:37.732967 4906 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/0bd90c3d-cb4d-4834-af5e-0342eb26d553-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 10 00:27:37 crc kubenswrapper[4906]: I0310 00:27:37.732980 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xmxd\" (UniqueName: \"kubernetes.io/projected/0bd90c3d-cb4d-4834-af5e-0342eb26d553-kube-api-access-5xmxd\") on node \"crc\" DevicePath \"\"" Mar 10 00:27:37 crc kubenswrapper[4906]: I0310 00:27:37.732993 4906 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/0bd90c3d-cb4d-4834-af5e-0342eb26d553-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 10 00:27:37 crc kubenswrapper[4906]: I0310 00:27:37.733006 4906 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0bd90c3d-cb4d-4834-af5e-0342eb26d553-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:27:37 crc kubenswrapper[4906]: I0310 00:27:37.733017 4906 
reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/0bd90c3d-cb4d-4834-af5e-0342eb26d553-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 10 00:27:37 crc kubenswrapper[4906]: I0310 00:27:37.733028 4906 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0bd90c3d-cb4d-4834-af5e-0342eb26d553-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:27:37 crc kubenswrapper[4906]: I0310 00:27:37.733039 4906 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/0bd90c3d-cb4d-4834-af5e-0342eb26d553-builder-dockercfg-7plhd-push\") on node \"crc\" DevicePath \"\"" Mar 10 00:27:37 crc kubenswrapper[4906]: I0310 00:27:37.745492 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bd90c3d-cb4d-4834-af5e-0342eb26d553-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "0bd90c3d-cb4d-4834-af5e-0342eb26d553" (UID: "0bd90c3d-cb4d-4834-af5e-0342eb26d553"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:27:37 crc kubenswrapper[4906]: I0310 00:27:37.834290 4906 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/0bd90c3d-cb4d-4834-af5e-0342eb26d553-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 10 00:27:37 crc kubenswrapper[4906]: I0310 00:27:37.994120 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bd90c3d-cb4d-4834-af5e-0342eb26d553-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "0bd90c3d-cb4d-4834-af5e-0342eb26d553" (UID: "0bd90c3d-cb4d-4834-af5e-0342eb26d553"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:27:38 crc kubenswrapper[4906]: I0310 00:27:38.037495 4906 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/0bd90c3d-cb4d-4834-af5e-0342eb26d553-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 10 00:27:38 crc kubenswrapper[4906]: I0310 00:27:38.235977 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-1-build_0bd90c3d-cb4d-4834-af5e-0342eb26d553/docker-build/0.log" Mar 10 00:27:38 crc kubenswrapper[4906]: I0310 00:27:38.236544 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"0bd90c3d-cb4d-4834-af5e-0342eb26d553","Type":"ContainerDied","Data":"aa795a0586ecfa332714b0eb0a6ec1d09812d4d9db63e0ed79db718ff6202b23"} Mar 10 00:27:38 crc kubenswrapper[4906]: I0310 00:27:38.236591 4906 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa795a0586ecfa332714b0eb0a6ec1d09812d4d9db63e0ed79db718ff6202b23" Mar 10 00:27:38 crc kubenswrapper[4906]: I0310 00:27:38.236687 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Mar 10 00:27:38 crc kubenswrapper[4906]: I0310 00:27:38.284551 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 10 00:27:38 crc kubenswrapper[4906]: I0310 00:27:38.290071 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 10 00:27:38 crc kubenswrapper[4906]: I0310 00:27:38.591990 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bd90c3d-cb4d-4834-af5e-0342eb26d553" path="/var/lib/kubelet/pods/0bd90c3d-cb4d-4834-af5e-0342eb26d553/volumes" Mar 10 00:27:38 crc kubenswrapper[4906]: I0310 00:27:38.754964 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-bridge-2-build"] Mar 10 00:27:38 crc kubenswrapper[4906]: E0310 00:27:38.755219 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bd90c3d-cb4d-4834-af5e-0342eb26d553" containerName="manage-dockerfile" Mar 10 00:27:38 crc kubenswrapper[4906]: I0310 00:27:38.755233 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bd90c3d-cb4d-4834-af5e-0342eb26d553" containerName="manage-dockerfile" Mar 10 00:27:38 crc kubenswrapper[4906]: E0310 00:27:38.755253 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bd90c3d-cb4d-4834-af5e-0342eb26d553" containerName="docker-build" Mar 10 00:27:38 crc kubenswrapper[4906]: I0310 00:27:38.755260 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bd90c3d-cb4d-4834-af5e-0342eb26d553" containerName="docker-build" Mar 10 00:27:38 crc kubenswrapper[4906]: I0310 00:27:38.755363 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bd90c3d-cb4d-4834-af5e-0342eb26d553" containerName="docker-build" Mar 10 00:27:38 crc kubenswrapper[4906]: I0310 00:27:38.756141 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Mar 10 00:27:38 crc kubenswrapper[4906]: I0310 00:27:38.757956 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-2-ca" Mar 10 00:27:38 crc kubenswrapper[4906]: I0310 00:27:38.758570 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-2-global-ca" Mar 10 00:27:38 crc kubenswrapper[4906]: I0310 00:27:38.758963 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-2-sys-config" Mar 10 00:27:38 crc kubenswrapper[4906]: I0310 00:27:38.759188 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-7plhd" Mar 10 00:27:38 crc kubenswrapper[4906]: I0310 00:27:38.779456 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-2-build"] Mar 10 00:27:38 crc kubenswrapper[4906]: I0310 00:27:38.847971 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3e366a01-d784-4d60-8e9f-f20451001fbc-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"3e366a01-d784-4d60-8e9f-f20451001fbc\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:27:38 crc kubenswrapper[4906]: I0310 00:27:38.848025 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3e366a01-d784-4d60-8e9f-f20451001fbc-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"3e366a01-d784-4d60-8e9f-f20451001fbc\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:27:38 crc kubenswrapper[4906]: I0310 00:27:38.848163 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: 
\"kubernetes.io/empty-dir/3e366a01-d784-4d60-8e9f-f20451001fbc-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"3e366a01-d784-4d60-8e9f-f20451001fbc\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:27:38 crc kubenswrapper[4906]: I0310 00:27:38.848293 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3e366a01-d784-4d60-8e9f-f20451001fbc-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"3e366a01-d784-4d60-8e9f-f20451001fbc\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:27:38 crc kubenswrapper[4906]: I0310 00:27:38.848349 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtmt6\" (UniqueName: \"kubernetes.io/projected/3e366a01-d784-4d60-8e9f-f20451001fbc-kube-api-access-rtmt6\") pod \"sg-bridge-2-build\" (UID: \"3e366a01-d784-4d60-8e9f-f20451001fbc\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:27:38 crc kubenswrapper[4906]: I0310 00:27:38.848387 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/3e366a01-d784-4d60-8e9f-f20451001fbc-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"3e366a01-d784-4d60-8e9f-f20451001fbc\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:27:38 crc kubenswrapper[4906]: I0310 00:27:38.848512 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/3e366a01-d784-4d60-8e9f-f20451001fbc-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"3e366a01-d784-4d60-8e9f-f20451001fbc\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:27:38 crc kubenswrapper[4906]: I0310 00:27:38.848577 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/3e366a01-d784-4d60-8e9f-f20451001fbc-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"3e366a01-d784-4d60-8e9f-f20451001fbc\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:27:38 crc kubenswrapper[4906]: I0310 00:27:38.848612 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/3e366a01-d784-4d60-8e9f-f20451001fbc-builder-dockercfg-7plhd-push\") pod \"sg-bridge-2-build\" (UID: \"3e366a01-d784-4d60-8e9f-f20451001fbc\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:27:38 crc kubenswrapper[4906]: I0310 00:27:38.848764 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/3e366a01-d784-4d60-8e9f-f20451001fbc-buildworkdir\") pod \"sg-bridge-2-build\" (UID: \"3e366a01-d784-4d60-8e9f-f20451001fbc\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:27:38 crc kubenswrapper[4906]: I0310 00:27:38.848811 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/3e366a01-d784-4d60-8e9f-f20451001fbc-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"3e366a01-d784-4d60-8e9f-f20451001fbc\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:27:38 crc kubenswrapper[4906]: I0310 00:27:38.848856 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/3e366a01-d784-4d60-8e9f-f20451001fbc-builder-dockercfg-7plhd-pull\") pod \"sg-bridge-2-build\" (UID: \"3e366a01-d784-4d60-8e9f-f20451001fbc\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:27:38 crc kubenswrapper[4906]: I0310 00:27:38.949800 4906 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/3e366a01-d784-4d60-8e9f-f20451001fbc-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"3e366a01-d784-4d60-8e9f-f20451001fbc\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:27:38 crc kubenswrapper[4906]: I0310 00:27:38.949841 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/3e366a01-d784-4d60-8e9f-f20451001fbc-builder-dockercfg-7plhd-push\") pod \"sg-bridge-2-build\" (UID: \"3e366a01-d784-4d60-8e9f-f20451001fbc\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:27:38 crc kubenswrapper[4906]: I0310 00:27:38.949879 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/3e366a01-d784-4d60-8e9f-f20451001fbc-buildworkdir\") pod \"sg-bridge-2-build\" (UID: \"3e366a01-d784-4d60-8e9f-f20451001fbc\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:27:38 crc kubenswrapper[4906]: I0310 00:27:38.949894 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/3e366a01-d784-4d60-8e9f-f20451001fbc-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"3e366a01-d784-4d60-8e9f-f20451001fbc\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:27:38 crc kubenswrapper[4906]: I0310 00:27:38.949914 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/3e366a01-d784-4d60-8e9f-f20451001fbc-builder-dockercfg-7plhd-pull\") pod \"sg-bridge-2-build\" (UID: \"3e366a01-d784-4d60-8e9f-f20451001fbc\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:27:38 crc kubenswrapper[4906]: I0310 00:27:38.949932 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/3e366a01-d784-4d60-8e9f-f20451001fbc-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"3e366a01-d784-4d60-8e9f-f20451001fbc\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:27:38 crc kubenswrapper[4906]: I0310 00:27:38.949953 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3e366a01-d784-4d60-8e9f-f20451001fbc-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"3e366a01-d784-4d60-8e9f-f20451001fbc\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:27:38 crc kubenswrapper[4906]: I0310 00:27:38.949980 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/3e366a01-d784-4d60-8e9f-f20451001fbc-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"3e366a01-d784-4d60-8e9f-f20451001fbc\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:27:38 crc kubenswrapper[4906]: I0310 00:27:38.950002 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3e366a01-d784-4d60-8e9f-f20451001fbc-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"3e366a01-d784-4d60-8e9f-f20451001fbc\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:27:38 crc kubenswrapper[4906]: I0310 00:27:38.950033 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtmt6\" (UniqueName: \"kubernetes.io/projected/3e366a01-d784-4d60-8e9f-f20451001fbc-kube-api-access-rtmt6\") pod \"sg-bridge-2-build\" (UID: \"3e366a01-d784-4d60-8e9f-f20451001fbc\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:27:38 crc kubenswrapper[4906]: I0310 00:27:38.950056 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: 
\"kubernetes.io/host-path/3e366a01-d784-4d60-8e9f-f20451001fbc-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"3e366a01-d784-4d60-8e9f-f20451001fbc\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:27:38 crc kubenswrapper[4906]: I0310 00:27:38.950086 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/3e366a01-d784-4d60-8e9f-f20451001fbc-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"3e366a01-d784-4d60-8e9f-f20451001fbc\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:27:38 crc kubenswrapper[4906]: I0310 00:27:38.950361 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/3e366a01-d784-4d60-8e9f-f20451001fbc-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"3e366a01-d784-4d60-8e9f-f20451001fbc\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:27:38 crc kubenswrapper[4906]: I0310 00:27:38.950421 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/3e366a01-d784-4d60-8e9f-f20451001fbc-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"3e366a01-d784-4d60-8e9f-f20451001fbc\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:27:38 crc kubenswrapper[4906]: I0310 00:27:38.950471 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3e366a01-d784-4d60-8e9f-f20451001fbc-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"3e366a01-d784-4d60-8e9f-f20451001fbc\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:27:38 crc kubenswrapper[4906]: I0310 00:27:38.950494 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/3e366a01-d784-4d60-8e9f-f20451001fbc-buildworkdir\") pod \"sg-bridge-2-build\" (UID: 
\"3e366a01-d784-4d60-8e9f-f20451001fbc\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:27:38 crc kubenswrapper[4906]: I0310 00:27:38.950546 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/3e366a01-d784-4d60-8e9f-f20451001fbc-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"3e366a01-d784-4d60-8e9f-f20451001fbc\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:27:38 crc kubenswrapper[4906]: I0310 00:27:38.950838 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/3e366a01-d784-4d60-8e9f-f20451001fbc-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"3e366a01-d784-4d60-8e9f-f20451001fbc\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:27:38 crc kubenswrapper[4906]: I0310 00:27:38.951001 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/3e366a01-d784-4d60-8e9f-f20451001fbc-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"3e366a01-d784-4d60-8e9f-f20451001fbc\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:27:38 crc kubenswrapper[4906]: I0310 00:27:38.951249 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3e366a01-d784-4d60-8e9f-f20451001fbc-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"3e366a01-d784-4d60-8e9f-f20451001fbc\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:27:38 crc kubenswrapper[4906]: I0310 00:27:38.951353 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3e366a01-d784-4d60-8e9f-f20451001fbc-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"3e366a01-d784-4d60-8e9f-f20451001fbc\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:27:38 
crc kubenswrapper[4906]: I0310 00:27:38.954999 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/3e366a01-d784-4d60-8e9f-f20451001fbc-builder-dockercfg-7plhd-push\") pod \"sg-bridge-2-build\" (UID: \"3e366a01-d784-4d60-8e9f-f20451001fbc\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:27:38 crc kubenswrapper[4906]: I0310 00:27:38.957313 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/3e366a01-d784-4d60-8e9f-f20451001fbc-builder-dockercfg-7plhd-pull\") pod \"sg-bridge-2-build\" (UID: \"3e366a01-d784-4d60-8e9f-f20451001fbc\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:27:38 crc kubenswrapper[4906]: I0310 00:27:38.967745 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtmt6\" (UniqueName: \"kubernetes.io/projected/3e366a01-d784-4d60-8e9f-f20451001fbc-kube-api-access-rtmt6\") pod \"sg-bridge-2-build\" (UID: \"3e366a01-d784-4d60-8e9f-f20451001fbc\") " pod="service-telemetry/sg-bridge-2-build" Mar 10 00:27:39 crc kubenswrapper[4906]: I0310 00:27:39.077807 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Mar 10 00:27:39 crc kubenswrapper[4906]: I0310 00:27:39.288307 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-2-build"] Mar 10 00:27:40 crc kubenswrapper[4906]: I0310 00:27:40.252836 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"3e366a01-d784-4d60-8e9f-f20451001fbc","Type":"ContainerStarted","Data":"17fb7a07a86bc8c45e2381061fb7c907bf2da02bcfd3928f081be95cc88fe619"} Mar 10 00:27:40 crc kubenswrapper[4906]: I0310 00:27:40.252879 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"3e366a01-d784-4d60-8e9f-f20451001fbc","Type":"ContainerStarted","Data":"f19e13ba6f2a5425d2374bc44104422e5adbbf3d6b7d46f0ccffd80613f48c1c"} Mar 10 00:27:41 crc kubenswrapper[4906]: I0310 00:27:41.261402 4906 generic.go:334] "Generic (PLEG): container finished" podID="3e366a01-d784-4d60-8e9f-f20451001fbc" containerID="17fb7a07a86bc8c45e2381061fb7c907bf2da02bcfd3928f081be95cc88fe619" exitCode=0 Mar 10 00:27:41 crc kubenswrapper[4906]: I0310 00:27:41.261448 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"3e366a01-d784-4d60-8e9f-f20451001fbc","Type":"ContainerDied","Data":"17fb7a07a86bc8c45e2381061fb7c907bf2da02bcfd3928f081be95cc88fe619"} Mar 10 00:27:42 crc kubenswrapper[4906]: I0310 00:27:42.268857 4906 generic.go:334] "Generic (PLEG): container finished" podID="3e366a01-d784-4d60-8e9f-f20451001fbc" containerID="54149cf662fff2f5d5a067c26e4c563ce1994fa9cc03005269942d14377da159" exitCode=0 Mar 10 00:27:42 crc kubenswrapper[4906]: I0310 00:27:42.269058 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"3e366a01-d784-4d60-8e9f-f20451001fbc","Type":"ContainerDied","Data":"54149cf662fff2f5d5a067c26e4c563ce1994fa9cc03005269942d14377da159"} Mar 10 00:27:42 
crc kubenswrapper[4906]: I0310 00:27:42.295240 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-2-build_3e366a01-d784-4d60-8e9f-f20451001fbc/manage-dockerfile/0.log" Mar 10 00:27:43 crc kubenswrapper[4906]: I0310 00:27:43.280307 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"3e366a01-d784-4d60-8e9f-f20451001fbc","Type":"ContainerStarted","Data":"7e0bb349735950deba4e263f335935fec2ae8dea8a7b2ca8ccc953a203dd60f0"} Mar 10 00:27:43 crc kubenswrapper[4906]: I0310 00:27:43.314865 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-bridge-2-build" podStartSLOduration=5.31483926 podStartE2EDuration="5.31483926s" podCreationTimestamp="2026-03-10 00:27:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:27:43.302588877 +0000 UTC m=+1289.450483999" watchObservedRunningTime="2026-03-10 00:27:43.31483926 +0000 UTC m=+1289.462734402" Mar 10 00:28:00 crc kubenswrapper[4906]: I0310 00:28:00.161953 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551708-2hp2g"] Mar 10 00:28:00 crc kubenswrapper[4906]: I0310 00:28:00.163363 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551708-2hp2g" Mar 10 00:28:00 crc kubenswrapper[4906]: I0310 00:28:00.166506 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 00:28:00 crc kubenswrapper[4906]: I0310 00:28:00.166619 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ktdr6" Mar 10 00:28:00 crc kubenswrapper[4906]: I0310 00:28:00.166807 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 00:28:00 crc kubenswrapper[4906]: I0310 00:28:00.183304 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551708-2hp2g"] Mar 10 00:28:00 crc kubenswrapper[4906]: I0310 00:28:00.265211 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgvm9\" (UniqueName: \"kubernetes.io/projected/d5f5c4bb-bcea-4d6a-8732-a69f6d373952-kube-api-access-sgvm9\") pod \"auto-csr-approver-29551708-2hp2g\" (UID: \"d5f5c4bb-bcea-4d6a-8732-a69f6d373952\") " pod="openshift-infra/auto-csr-approver-29551708-2hp2g" Mar 10 00:28:00 crc kubenswrapper[4906]: I0310 00:28:00.366851 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgvm9\" (UniqueName: \"kubernetes.io/projected/d5f5c4bb-bcea-4d6a-8732-a69f6d373952-kube-api-access-sgvm9\") pod \"auto-csr-approver-29551708-2hp2g\" (UID: \"d5f5c4bb-bcea-4d6a-8732-a69f6d373952\") " pod="openshift-infra/auto-csr-approver-29551708-2hp2g" Mar 10 00:28:00 crc kubenswrapper[4906]: I0310 00:28:00.399916 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgvm9\" (UniqueName: \"kubernetes.io/projected/d5f5c4bb-bcea-4d6a-8732-a69f6d373952-kube-api-access-sgvm9\") pod \"auto-csr-approver-29551708-2hp2g\" (UID: \"d5f5c4bb-bcea-4d6a-8732-a69f6d373952\") " 
pod="openshift-infra/auto-csr-approver-29551708-2hp2g" Mar 10 00:28:00 crc kubenswrapper[4906]: I0310 00:28:00.499157 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551708-2hp2g" Mar 10 00:28:00 crc kubenswrapper[4906]: I0310 00:28:00.502312 4906 patch_prober.go:28] interesting pod/machine-config-daemon-bxtw4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:28:00 crc kubenswrapper[4906]: I0310 00:28:00.502359 4906 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:28:00 crc kubenswrapper[4906]: I0310 00:28:00.747783 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551708-2hp2g"] Mar 10 00:28:00 crc kubenswrapper[4906]: I0310 00:28:00.761891 4906 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 00:28:01 crc kubenswrapper[4906]: I0310 00:28:01.415898 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551708-2hp2g" event={"ID":"d5f5c4bb-bcea-4d6a-8732-a69f6d373952","Type":"ContainerStarted","Data":"04377a8fd0d56f977709570b1e0a76f14babfa2f3d94c68594f99ee245a1c0ce"} Mar 10 00:28:02 crc kubenswrapper[4906]: I0310 00:28:02.422395 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551708-2hp2g" event={"ID":"d5f5c4bb-bcea-4d6a-8732-a69f6d373952","Type":"ContainerStarted","Data":"80f5294adf972196dc5188a3767901e794ef66f50e16a446c09bbf7826241001"} Mar 10 00:28:03 
crc kubenswrapper[4906]: I0310 00:28:03.430149 4906 generic.go:334] "Generic (PLEG): container finished" podID="d5f5c4bb-bcea-4d6a-8732-a69f6d373952" containerID="80f5294adf972196dc5188a3767901e794ef66f50e16a446c09bbf7826241001" exitCode=0 Mar 10 00:28:03 crc kubenswrapper[4906]: I0310 00:28:03.430239 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551708-2hp2g" event={"ID":"d5f5c4bb-bcea-4d6a-8732-a69f6d373952","Type":"ContainerDied","Data":"80f5294adf972196dc5188a3767901e794ef66f50e16a446c09bbf7826241001"} Mar 10 00:28:04 crc kubenswrapper[4906]: I0310 00:28:04.764126 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551708-2hp2g" Mar 10 00:28:04 crc kubenswrapper[4906]: I0310 00:28:04.942517 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgvm9\" (UniqueName: \"kubernetes.io/projected/d5f5c4bb-bcea-4d6a-8732-a69f6d373952-kube-api-access-sgvm9\") pod \"d5f5c4bb-bcea-4d6a-8732-a69f6d373952\" (UID: \"d5f5c4bb-bcea-4d6a-8732-a69f6d373952\") " Mar 10 00:28:04 crc kubenswrapper[4906]: I0310 00:28:04.952599 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5f5c4bb-bcea-4d6a-8732-a69f6d373952-kube-api-access-sgvm9" (OuterVolumeSpecName: "kube-api-access-sgvm9") pod "d5f5c4bb-bcea-4d6a-8732-a69f6d373952" (UID: "d5f5c4bb-bcea-4d6a-8732-a69f6d373952"). InnerVolumeSpecName "kube-api-access-sgvm9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:28:05 crc kubenswrapper[4906]: I0310 00:28:05.044934 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgvm9\" (UniqueName: \"kubernetes.io/projected/d5f5c4bb-bcea-4d6a-8732-a69f6d373952-kube-api-access-sgvm9\") on node \"crc\" DevicePath \"\"" Mar 10 00:28:05 crc kubenswrapper[4906]: I0310 00:28:05.447606 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551708-2hp2g" event={"ID":"d5f5c4bb-bcea-4d6a-8732-a69f6d373952","Type":"ContainerDied","Data":"04377a8fd0d56f977709570b1e0a76f14babfa2f3d94c68594f99ee245a1c0ce"} Mar 10 00:28:05 crc kubenswrapper[4906]: I0310 00:28:05.447676 4906 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04377a8fd0d56f977709570b1e0a76f14babfa2f3d94c68594f99ee245a1c0ce" Mar 10 00:28:05 crc kubenswrapper[4906]: I0310 00:28:05.447730 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551708-2hp2g" Mar 10 00:28:05 crc kubenswrapper[4906]: I0310 00:28:05.510217 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551702-jk7xb"] Mar 10 00:28:05 crc kubenswrapper[4906]: I0310 00:28:05.515349 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551702-jk7xb"] Mar 10 00:28:06 crc kubenswrapper[4906]: I0310 00:28:06.585537 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c19fcab-7928-4edf-a882-003c32d33473" path="/var/lib/kubelet/pods/6c19fcab-7928-4edf-a882-003c32d33473/volumes" Mar 10 00:28:22 crc kubenswrapper[4906]: I0310 00:28:22.013306 4906 scope.go:117] "RemoveContainer" containerID="3c3988e44e458523814e6fc3937ca0a4d8f2d2cf37a310c21836f4810d53f5c1" Mar 10 00:28:26 crc kubenswrapper[4906]: I0310 00:28:26.283818 4906 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-fmppw"] Mar 10 00:28:26 crc kubenswrapper[4906]: E0310 00:28:26.284524 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5f5c4bb-bcea-4d6a-8732-a69f6d373952" containerName="oc" Mar 10 00:28:26 crc kubenswrapper[4906]: I0310 00:28:26.284545 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5f5c4bb-bcea-4d6a-8732-a69f6d373952" containerName="oc" Mar 10 00:28:26 crc kubenswrapper[4906]: I0310 00:28:26.284777 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5f5c4bb-bcea-4d6a-8732-a69f6d373952" containerName="oc" Mar 10 00:28:26 crc kubenswrapper[4906]: I0310 00:28:26.286193 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fmppw" Mar 10 00:28:26 crc kubenswrapper[4906]: I0310 00:28:26.291275 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fmppw"] Mar 10 00:28:26 crc kubenswrapper[4906]: I0310 00:28:26.440924 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cbf0261-72a9-4700-bca1-614749812f81-utilities\") pod \"certified-operators-fmppw\" (UID: \"6cbf0261-72a9-4700-bca1-614749812f81\") " pod="openshift-marketplace/certified-operators-fmppw" Mar 10 00:28:26 crc kubenswrapper[4906]: I0310 00:28:26.441052 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cbf0261-72a9-4700-bca1-614749812f81-catalog-content\") pod \"certified-operators-fmppw\" (UID: \"6cbf0261-72a9-4700-bca1-614749812f81\") " pod="openshift-marketplace/certified-operators-fmppw" Mar 10 00:28:26 crc kubenswrapper[4906]: I0310 00:28:26.441257 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsg8w\" 
(UniqueName: \"kubernetes.io/projected/6cbf0261-72a9-4700-bca1-614749812f81-kube-api-access-jsg8w\") pod \"certified-operators-fmppw\" (UID: \"6cbf0261-72a9-4700-bca1-614749812f81\") " pod="openshift-marketplace/certified-operators-fmppw" Mar 10 00:28:26 crc kubenswrapper[4906]: I0310 00:28:26.543177 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cbf0261-72a9-4700-bca1-614749812f81-catalog-content\") pod \"certified-operators-fmppw\" (UID: \"6cbf0261-72a9-4700-bca1-614749812f81\") " pod="openshift-marketplace/certified-operators-fmppw" Mar 10 00:28:26 crc kubenswrapper[4906]: I0310 00:28:26.543250 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsg8w\" (UniqueName: \"kubernetes.io/projected/6cbf0261-72a9-4700-bca1-614749812f81-kube-api-access-jsg8w\") pod \"certified-operators-fmppw\" (UID: \"6cbf0261-72a9-4700-bca1-614749812f81\") " pod="openshift-marketplace/certified-operators-fmppw" Mar 10 00:28:26 crc kubenswrapper[4906]: I0310 00:28:26.543334 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cbf0261-72a9-4700-bca1-614749812f81-utilities\") pod \"certified-operators-fmppw\" (UID: \"6cbf0261-72a9-4700-bca1-614749812f81\") " pod="openshift-marketplace/certified-operators-fmppw" Mar 10 00:28:26 crc kubenswrapper[4906]: I0310 00:28:26.543970 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cbf0261-72a9-4700-bca1-614749812f81-utilities\") pod \"certified-operators-fmppw\" (UID: \"6cbf0261-72a9-4700-bca1-614749812f81\") " pod="openshift-marketplace/certified-operators-fmppw" Mar 10 00:28:26 crc kubenswrapper[4906]: I0310 00:28:26.544085 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/6cbf0261-72a9-4700-bca1-614749812f81-catalog-content\") pod \"certified-operators-fmppw\" (UID: \"6cbf0261-72a9-4700-bca1-614749812f81\") " pod="openshift-marketplace/certified-operators-fmppw" Mar 10 00:28:26 crc kubenswrapper[4906]: I0310 00:28:26.566195 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsg8w\" (UniqueName: \"kubernetes.io/projected/6cbf0261-72a9-4700-bca1-614749812f81-kube-api-access-jsg8w\") pod \"certified-operators-fmppw\" (UID: \"6cbf0261-72a9-4700-bca1-614749812f81\") " pod="openshift-marketplace/certified-operators-fmppw" Mar 10 00:28:26 crc kubenswrapper[4906]: I0310 00:28:26.614714 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fmppw" Mar 10 00:28:27 crc kubenswrapper[4906]: I0310 00:28:27.014619 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fmppw"] Mar 10 00:28:27 crc kubenswrapper[4906]: I0310 00:28:27.617394 4906 generic.go:334] "Generic (PLEG): container finished" podID="6cbf0261-72a9-4700-bca1-614749812f81" containerID="bdcebff5f4e9a0154dcfbfa4d1e7b812575117c033ef893fb4f65f18ba1a7c27" exitCode=0 Mar 10 00:28:27 crc kubenswrapper[4906]: I0310 00:28:27.617483 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fmppw" event={"ID":"6cbf0261-72a9-4700-bca1-614749812f81","Type":"ContainerDied","Data":"bdcebff5f4e9a0154dcfbfa4d1e7b812575117c033ef893fb4f65f18ba1a7c27"} Mar 10 00:28:27 crc kubenswrapper[4906]: I0310 00:28:27.617655 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fmppw" event={"ID":"6cbf0261-72a9-4700-bca1-614749812f81","Type":"ContainerStarted","Data":"9b0c823f42ff3da35f3b179ceb392cbf7fc4228f07484aeb33c91a3daa91b936"} Mar 10 00:28:28 crc kubenswrapper[4906]: I0310 00:28:28.629621 4906 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-fmppw" event={"ID":"6cbf0261-72a9-4700-bca1-614749812f81","Type":"ContainerStarted","Data":"d91580c802ead27b805bd5e98c3651f1d6c18c4a44e723237c04f992677836b4"} Mar 10 00:28:28 crc kubenswrapper[4906]: I0310 00:28:28.633423 4906 generic.go:334] "Generic (PLEG): container finished" podID="3e366a01-d784-4d60-8e9f-f20451001fbc" containerID="7e0bb349735950deba4e263f335935fec2ae8dea8a7b2ca8ccc953a203dd60f0" exitCode=0 Mar 10 00:28:28 crc kubenswrapper[4906]: I0310 00:28:28.633453 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"3e366a01-d784-4d60-8e9f-f20451001fbc","Type":"ContainerDied","Data":"7e0bb349735950deba4e263f335935fec2ae8dea8a7b2ca8ccc953a203dd60f0"} Mar 10 00:28:29 crc kubenswrapper[4906]: I0310 00:28:29.653741 4906 generic.go:334] "Generic (PLEG): container finished" podID="6cbf0261-72a9-4700-bca1-614749812f81" containerID="d91580c802ead27b805bd5e98c3651f1d6c18c4a44e723237c04f992677836b4" exitCode=0 Mar 10 00:28:29 crc kubenswrapper[4906]: I0310 00:28:29.653893 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fmppw" event={"ID":"6cbf0261-72a9-4700-bca1-614749812f81","Type":"ContainerDied","Data":"d91580c802ead27b805bd5e98c3651f1d6c18c4a44e723237c04f992677836b4"} Mar 10 00:28:29 crc kubenswrapper[4906]: I0310 00:28:29.951594 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:30 crc kubenswrapper[4906]: I0310 00:28:30.092171 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtmt6\" (UniqueName: \"kubernetes.io/projected/3e366a01-d784-4d60-8e9f-f20451001fbc-kube-api-access-rtmt6\") pod \"3e366a01-d784-4d60-8e9f-f20451001fbc\" (UID: \"3e366a01-d784-4d60-8e9f-f20451001fbc\") " Mar 10 00:28:30 crc kubenswrapper[4906]: I0310 00:28:30.092233 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/3e366a01-d784-4d60-8e9f-f20451001fbc-builder-dockercfg-7plhd-push\") pod \"3e366a01-d784-4d60-8e9f-f20451001fbc\" (UID: \"3e366a01-d784-4d60-8e9f-f20451001fbc\") " Mar 10 00:28:30 crc kubenswrapper[4906]: I0310 00:28:30.092252 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/3e366a01-d784-4d60-8e9f-f20451001fbc-build-blob-cache\") pod \"3e366a01-d784-4d60-8e9f-f20451001fbc\" (UID: \"3e366a01-d784-4d60-8e9f-f20451001fbc\") " Mar 10 00:28:30 crc kubenswrapper[4906]: I0310 00:28:30.092281 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/3e366a01-d784-4d60-8e9f-f20451001fbc-container-storage-root\") pod \"3e366a01-d784-4d60-8e9f-f20451001fbc\" (UID: \"3e366a01-d784-4d60-8e9f-f20451001fbc\") " Mar 10 00:28:30 crc kubenswrapper[4906]: I0310 00:28:30.092314 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3e366a01-d784-4d60-8e9f-f20451001fbc-build-ca-bundles\") pod \"3e366a01-d784-4d60-8e9f-f20451001fbc\" (UID: \"3e366a01-d784-4d60-8e9f-f20451001fbc\") " Mar 10 00:28:30 crc kubenswrapper[4906]: I0310 00:28:30.092329 4906 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/3e366a01-d784-4d60-8e9f-f20451001fbc-container-storage-run\") pod \"3e366a01-d784-4d60-8e9f-f20451001fbc\" (UID: \"3e366a01-d784-4d60-8e9f-f20451001fbc\") " Mar 10 00:28:30 crc kubenswrapper[4906]: I0310 00:28:30.092346 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3e366a01-d784-4d60-8e9f-f20451001fbc-node-pullsecrets\") pod \"3e366a01-d784-4d60-8e9f-f20451001fbc\" (UID: \"3e366a01-d784-4d60-8e9f-f20451001fbc\") " Mar 10 00:28:30 crc kubenswrapper[4906]: I0310 00:28:30.092362 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/3e366a01-d784-4d60-8e9f-f20451001fbc-buildcachedir\") pod \"3e366a01-d784-4d60-8e9f-f20451001fbc\" (UID: \"3e366a01-d784-4d60-8e9f-f20451001fbc\") " Mar 10 00:28:30 crc kubenswrapper[4906]: I0310 00:28:30.092383 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3e366a01-d784-4d60-8e9f-f20451001fbc-build-proxy-ca-bundles\") pod \"3e366a01-d784-4d60-8e9f-f20451001fbc\" (UID: \"3e366a01-d784-4d60-8e9f-f20451001fbc\") " Mar 10 00:28:30 crc kubenswrapper[4906]: I0310 00:28:30.092397 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/3e366a01-d784-4d60-8e9f-f20451001fbc-buildworkdir\") pod \"3e366a01-d784-4d60-8e9f-f20451001fbc\" (UID: \"3e366a01-d784-4d60-8e9f-f20451001fbc\") " Mar 10 00:28:30 crc kubenswrapper[4906]: I0310 00:28:30.092440 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/3e366a01-d784-4d60-8e9f-f20451001fbc-build-system-configs\") pod 
\"3e366a01-d784-4d60-8e9f-f20451001fbc\" (UID: \"3e366a01-d784-4d60-8e9f-f20451001fbc\") " Mar 10 00:28:30 crc kubenswrapper[4906]: I0310 00:28:30.092464 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/3e366a01-d784-4d60-8e9f-f20451001fbc-builder-dockercfg-7plhd-pull\") pod \"3e366a01-d784-4d60-8e9f-f20451001fbc\" (UID: \"3e366a01-d784-4d60-8e9f-f20451001fbc\") " Mar 10 00:28:30 crc kubenswrapper[4906]: I0310 00:28:30.093821 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3e366a01-d784-4d60-8e9f-f20451001fbc-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "3e366a01-d784-4d60-8e9f-f20451001fbc" (UID: "3e366a01-d784-4d60-8e9f-f20451001fbc"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:28:30 crc kubenswrapper[4906]: I0310 00:28:30.093964 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e366a01-d784-4d60-8e9f-f20451001fbc-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "3e366a01-d784-4d60-8e9f-f20451001fbc" (UID: "3e366a01-d784-4d60-8e9f-f20451001fbc"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:28:30 crc kubenswrapper[4906]: I0310 00:28:30.094556 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e366a01-d784-4d60-8e9f-f20451001fbc-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "3e366a01-d784-4d60-8e9f-f20451001fbc" (UID: "3e366a01-d784-4d60-8e9f-f20451001fbc"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:28:30 crc kubenswrapper[4906]: I0310 00:28:30.094942 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3e366a01-d784-4d60-8e9f-f20451001fbc-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "3e366a01-d784-4d60-8e9f-f20451001fbc" (UID: "3e366a01-d784-4d60-8e9f-f20451001fbc"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:28:30 crc kubenswrapper[4906]: I0310 00:28:30.094944 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e366a01-d784-4d60-8e9f-f20451001fbc-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "3e366a01-d784-4d60-8e9f-f20451001fbc" (UID: "3e366a01-d784-4d60-8e9f-f20451001fbc"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:28:30 crc kubenswrapper[4906]: I0310 00:28:30.095538 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e366a01-d784-4d60-8e9f-f20451001fbc-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "3e366a01-d784-4d60-8e9f-f20451001fbc" (UID: "3e366a01-d784-4d60-8e9f-f20451001fbc"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:28:30 crc kubenswrapper[4906]: I0310 00:28:30.096087 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e366a01-d784-4d60-8e9f-f20451001fbc-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "3e366a01-d784-4d60-8e9f-f20451001fbc" (UID: "3e366a01-d784-4d60-8e9f-f20451001fbc"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:28:30 crc kubenswrapper[4906]: I0310 00:28:30.098006 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e366a01-d784-4d60-8e9f-f20451001fbc-kube-api-access-rtmt6" (OuterVolumeSpecName: "kube-api-access-rtmt6") pod "3e366a01-d784-4d60-8e9f-f20451001fbc" (UID: "3e366a01-d784-4d60-8e9f-f20451001fbc"). InnerVolumeSpecName "kube-api-access-rtmt6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:28:30 crc kubenswrapper[4906]: I0310 00:28:30.098529 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e366a01-d784-4d60-8e9f-f20451001fbc-builder-dockercfg-7plhd-push" (OuterVolumeSpecName: "builder-dockercfg-7plhd-push") pod "3e366a01-d784-4d60-8e9f-f20451001fbc" (UID: "3e366a01-d784-4d60-8e9f-f20451001fbc"). InnerVolumeSpecName "builder-dockercfg-7plhd-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:28:30 crc kubenswrapper[4906]: I0310 00:28:30.098833 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e366a01-d784-4d60-8e9f-f20451001fbc-builder-dockercfg-7plhd-pull" (OuterVolumeSpecName: "builder-dockercfg-7plhd-pull") pod "3e366a01-d784-4d60-8e9f-f20451001fbc" (UID: "3e366a01-d784-4d60-8e9f-f20451001fbc"). InnerVolumeSpecName "builder-dockercfg-7plhd-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:28:30 crc kubenswrapper[4906]: I0310 00:28:30.194164 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtmt6\" (UniqueName: \"kubernetes.io/projected/3e366a01-d784-4d60-8e9f-f20451001fbc-kube-api-access-rtmt6\") on node \"crc\" DevicePath \"\"" Mar 10 00:28:30 crc kubenswrapper[4906]: I0310 00:28:30.194202 4906 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/3e366a01-d784-4d60-8e9f-f20451001fbc-builder-dockercfg-7plhd-push\") on node \"crc\" DevicePath \"\"" Mar 10 00:28:30 crc kubenswrapper[4906]: I0310 00:28:30.194220 4906 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3e366a01-d784-4d60-8e9f-f20451001fbc-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:28:30 crc kubenswrapper[4906]: I0310 00:28:30.194237 4906 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/3e366a01-d784-4d60-8e9f-f20451001fbc-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 10 00:28:30 crc kubenswrapper[4906]: I0310 00:28:30.194250 4906 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/3e366a01-d784-4d60-8e9f-f20451001fbc-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 10 00:28:30 crc kubenswrapper[4906]: I0310 00:28:30.194263 4906 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3e366a01-d784-4d60-8e9f-f20451001fbc-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 10 00:28:30 crc kubenswrapper[4906]: I0310 00:28:30.194277 4906 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3e366a01-d784-4d60-8e9f-f20451001fbc-build-proxy-ca-bundles\") on node \"crc\" DevicePath 
\"\"" Mar 10 00:28:30 crc kubenswrapper[4906]: I0310 00:28:30.194291 4906 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/3e366a01-d784-4d60-8e9f-f20451001fbc-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 10 00:28:30 crc kubenswrapper[4906]: I0310 00:28:30.194304 4906 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/3e366a01-d784-4d60-8e9f-f20451001fbc-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 10 00:28:30 crc kubenswrapper[4906]: I0310 00:28:30.194317 4906 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/3e366a01-d784-4d60-8e9f-f20451001fbc-builder-dockercfg-7plhd-pull\") on node \"crc\" DevicePath \"\"" Mar 10 00:28:30 crc kubenswrapper[4906]: I0310 00:28:30.211564 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e366a01-d784-4d60-8e9f-f20451001fbc-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "3e366a01-d784-4d60-8e9f-f20451001fbc" (UID: "3e366a01-d784-4d60-8e9f-f20451001fbc"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:28:30 crc kubenswrapper[4906]: I0310 00:28:30.296765 4906 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/3e366a01-d784-4d60-8e9f-f20451001fbc-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 10 00:28:30 crc kubenswrapper[4906]: I0310 00:28:30.505127 4906 patch_prober.go:28] interesting pod/machine-config-daemon-bxtw4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:28:30 crc kubenswrapper[4906]: I0310 00:28:30.505185 4906 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:28:30 crc kubenswrapper[4906]: I0310 00:28:30.662621 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"3e366a01-d784-4d60-8e9f-f20451001fbc","Type":"ContainerDied","Data":"f19e13ba6f2a5425d2374bc44104422e5adbbf3d6b7d46f0ccffd80613f48c1c"} Mar 10 00:28:30 crc kubenswrapper[4906]: I0310 00:28:30.662728 4906 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f19e13ba6f2a5425d2374bc44104422e5adbbf3d6b7d46f0ccffd80613f48c1c" Mar 10 00:28:30 crc kubenswrapper[4906]: I0310 00:28:30.662828 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Mar 10 00:28:31 crc kubenswrapper[4906]: I0310 00:28:31.011141 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e366a01-d784-4d60-8e9f-f20451001fbc-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "3e366a01-d784-4d60-8e9f-f20451001fbc" (UID: "3e366a01-d784-4d60-8e9f-f20451001fbc"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:28:31 crc kubenswrapper[4906]: I0310 00:28:31.011827 4906 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/3e366a01-d784-4d60-8e9f-f20451001fbc-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 10 00:28:31 crc kubenswrapper[4906]: I0310 00:28:31.674438 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fmppw" event={"ID":"6cbf0261-72a9-4700-bca1-614749812f81","Type":"ContainerStarted","Data":"7b7ba308392c3dba50ac0c4abf61e8687d6b2cc3dd537a3a35f21bb0dcf18f9f"} Mar 10 00:28:31 crc kubenswrapper[4906]: I0310 00:28:31.703875 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fmppw" podStartSLOduration=2.6779345599999997 podStartE2EDuration="5.703854905s" podCreationTimestamp="2026-03-10 00:28:26 +0000 UTC" firstStartedPulling="2026-03-10 00:28:27.619153746 +0000 UTC m=+1333.767048858" lastFinishedPulling="2026-03-10 00:28:30.645074081 +0000 UTC m=+1336.792969203" observedRunningTime="2026-03-10 00:28:31.697098676 +0000 UTC m=+1337.844993868" watchObservedRunningTime="2026-03-10 00:28:31.703854905 +0000 UTC m=+1337.851750027" Mar 10 00:28:34 crc kubenswrapper[4906]: I0310 00:28:34.903250 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 10 00:28:34 crc kubenswrapper[4906]: 
E0310 00:28:34.904172 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e366a01-d784-4d60-8e9f-f20451001fbc" containerName="manage-dockerfile" Mar 10 00:28:34 crc kubenswrapper[4906]: I0310 00:28:34.904203 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e366a01-d784-4d60-8e9f-f20451001fbc" containerName="manage-dockerfile" Mar 10 00:28:34 crc kubenswrapper[4906]: E0310 00:28:34.904233 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e366a01-d784-4d60-8e9f-f20451001fbc" containerName="git-clone" Mar 10 00:28:34 crc kubenswrapper[4906]: I0310 00:28:34.904249 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e366a01-d784-4d60-8e9f-f20451001fbc" containerName="git-clone" Mar 10 00:28:34 crc kubenswrapper[4906]: E0310 00:28:34.904272 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e366a01-d784-4d60-8e9f-f20451001fbc" containerName="docker-build" Mar 10 00:28:34 crc kubenswrapper[4906]: I0310 00:28:34.904289 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e366a01-d784-4d60-8e9f-f20451001fbc" containerName="docker-build" Mar 10 00:28:34 crc kubenswrapper[4906]: I0310 00:28:34.904700 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e366a01-d784-4d60-8e9f-f20451001fbc" containerName="docker-build" Mar 10 00:28:34 crc kubenswrapper[4906]: I0310 00:28:34.905949 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:28:34 crc kubenswrapper[4906]: I0310 00:28:34.908695 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-1-global-ca" Mar 10 00:28:34 crc kubenswrapper[4906]: I0310 00:28:34.908910 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-1-sys-config" Mar 10 00:28:34 crc kubenswrapper[4906]: I0310 00:28:34.908936 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-7plhd" Mar 10 00:28:34 crc kubenswrapper[4906]: I0310 00:28:34.909899 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-1-ca" Mar 10 00:28:34 crc kubenswrapper[4906]: I0310 00:28:34.928914 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 10 00:28:34 crc kubenswrapper[4906]: I0310 00:28:34.968752 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/e028fc38-e4e1-4979-94f7-4d064ac8ed87-builder-dockercfg-7plhd-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:28:34 crc kubenswrapper[4906]: I0310 00:28:34.969223 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e028fc38-e4e1-4979-94f7-4d064ac8ed87-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:28:34 crc kubenswrapper[4906]: I0310 00:28:34.969261 4906 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e028fc38-e4e1-4979-94f7-4d064ac8ed87-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:28:34 crc kubenswrapper[4906]: I0310 00:28:34.969306 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e028fc38-e4e1-4979-94f7-4d064ac8ed87-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:28:34 crc kubenswrapper[4906]: I0310 00:28:34.969338 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e028fc38-e4e1-4979-94f7-4d064ac8ed87-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:28:34 crc kubenswrapper[4906]: I0310 00:28:34.969580 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e028fc38-e4e1-4979-94f7-4d064ac8ed87-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:28:34 crc kubenswrapper[4906]: I0310 00:28:34.969704 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e028fc38-e4e1-4979-94f7-4d064ac8ed87-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\") " 
pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:28:34 crc kubenswrapper[4906]: I0310 00:28:34.969794 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e028fc38-e4e1-4979-94f7-4d064ac8ed87-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:28:34 crc kubenswrapper[4906]: I0310 00:28:34.969854 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e028fc38-e4e1-4979-94f7-4d064ac8ed87-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:28:34 crc kubenswrapper[4906]: I0310 00:28:34.969909 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/e028fc38-e4e1-4979-94f7-4d064ac8ed87-builder-dockercfg-7plhd-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:28:34 crc kubenswrapper[4906]: I0310 00:28:34.969990 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs5fx\" (UniqueName: \"kubernetes.io/projected/e028fc38-e4e1-4979-94f7-4d064ac8ed87-kube-api-access-gs5fx\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:28:34 crc kubenswrapper[4906]: I0310 00:28:34.970056 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: 
\"kubernetes.io/configmap/e028fc38-e4e1-4979-94f7-4d064ac8ed87-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:28:35 crc kubenswrapper[4906]: I0310 00:28:35.072088 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gs5fx\" (UniqueName: \"kubernetes.io/projected/e028fc38-e4e1-4979-94f7-4d064ac8ed87-kube-api-access-gs5fx\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:28:35 crc kubenswrapper[4906]: I0310 00:28:35.072183 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e028fc38-e4e1-4979-94f7-4d064ac8ed87-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:28:35 crc kubenswrapper[4906]: I0310 00:28:35.072255 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/e028fc38-e4e1-4979-94f7-4d064ac8ed87-builder-dockercfg-7plhd-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:28:35 crc kubenswrapper[4906]: I0310 00:28:35.072293 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e028fc38-e4e1-4979-94f7-4d064ac8ed87-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:28:35 crc kubenswrapper[4906]: I0310 00:28:35.072331 4906 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e028fc38-e4e1-4979-94f7-4d064ac8ed87-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:28:35 crc kubenswrapper[4906]: I0310 00:28:35.072382 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e028fc38-e4e1-4979-94f7-4d064ac8ed87-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:28:35 crc kubenswrapper[4906]: I0310 00:28:35.072417 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e028fc38-e4e1-4979-94f7-4d064ac8ed87-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:28:35 crc kubenswrapper[4906]: I0310 00:28:35.072482 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e028fc38-e4e1-4979-94f7-4d064ac8ed87-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:28:35 crc kubenswrapper[4906]: I0310 00:28:35.072530 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e028fc38-e4e1-4979-94f7-4d064ac8ed87-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:28:35 crc 
kubenswrapper[4906]: I0310 00:28:35.072569 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e028fc38-e4e1-4979-94f7-4d064ac8ed87-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:28:35 crc kubenswrapper[4906]: I0310 00:28:35.072606 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e028fc38-e4e1-4979-94f7-4d064ac8ed87-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:28:35 crc kubenswrapper[4906]: I0310 00:28:35.072709 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/e028fc38-e4e1-4979-94f7-4d064ac8ed87-builder-dockercfg-7plhd-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:28:35 crc kubenswrapper[4906]: I0310 00:28:35.073004 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e028fc38-e4e1-4979-94f7-4d064ac8ed87-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:28:35 crc kubenswrapper[4906]: I0310 00:28:35.073479 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e028fc38-e4e1-4979-94f7-4d064ac8ed87-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\") " 
pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:28:35 crc kubenswrapper[4906]: I0310 00:28:35.073547 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e028fc38-e4e1-4979-94f7-4d064ac8ed87-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:28:35 crc kubenswrapper[4906]: I0310 00:28:35.073594 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e028fc38-e4e1-4979-94f7-4d064ac8ed87-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:28:35 crc kubenswrapper[4906]: I0310 00:28:35.073659 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e028fc38-e4e1-4979-94f7-4d064ac8ed87-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:28:35 crc kubenswrapper[4906]: I0310 00:28:35.073939 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e028fc38-e4e1-4979-94f7-4d064ac8ed87-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:28:35 crc kubenswrapper[4906]: I0310 00:28:35.074673 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e028fc38-e4e1-4979-94f7-4d064ac8ed87-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\") " 
pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:28:35 crc kubenswrapper[4906]: I0310 00:28:35.074736 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e028fc38-e4e1-4979-94f7-4d064ac8ed87-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:28:35 crc kubenswrapper[4906]: I0310 00:28:35.075824 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e028fc38-e4e1-4979-94f7-4d064ac8ed87-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:28:35 crc kubenswrapper[4906]: I0310 00:28:35.084351 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/e028fc38-e4e1-4979-94f7-4d064ac8ed87-builder-dockercfg-7plhd-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:28:35 crc kubenswrapper[4906]: I0310 00:28:35.084573 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/e028fc38-e4e1-4979-94f7-4d064ac8ed87-builder-dockercfg-7plhd-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:28:35 crc kubenswrapper[4906]: I0310 00:28:35.102934 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs5fx\" (UniqueName: \"kubernetes.io/projected/e028fc38-e4e1-4979-94f7-4d064ac8ed87-kube-api-access-gs5fx\") pod 
\"prometheus-webhook-snmp-1-build\" (UID: \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:28:35 crc kubenswrapper[4906]: I0310 00:28:35.239096 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:28:35 crc kubenswrapper[4906]: I0310 00:28:35.563930 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 10 00:28:35 crc kubenswrapper[4906]: W0310 00:28:35.570489 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode028fc38_e4e1_4979_94f7_4d064ac8ed87.slice/crio-2373bfbcf56802fafb2a4015f5ed842bcaf3670093e5598c7d8f6e33f48318ba WatchSource:0}: Error finding container 2373bfbcf56802fafb2a4015f5ed842bcaf3670093e5598c7d8f6e33f48318ba: Status 404 returned error can't find the container with id 2373bfbcf56802fafb2a4015f5ed842bcaf3670093e5598c7d8f6e33f48318ba Mar 10 00:28:35 crc kubenswrapper[4906]: I0310 00:28:35.727860 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"e028fc38-e4e1-4979-94f7-4d064ac8ed87","Type":"ContainerStarted","Data":"2373bfbcf56802fafb2a4015f5ed842bcaf3670093e5598c7d8f6e33f48318ba"} Mar 10 00:28:36 crc kubenswrapper[4906]: I0310 00:28:36.615274 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fmppw" Mar 10 00:28:36 crc kubenswrapper[4906]: I0310 00:28:36.615528 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fmppw" Mar 10 00:28:36 crc kubenswrapper[4906]: I0310 00:28:36.674767 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fmppw" Mar 10 00:28:36 crc kubenswrapper[4906]: I0310 
00:28:36.738306 4906 generic.go:334] "Generic (PLEG): container finished" podID="e028fc38-e4e1-4979-94f7-4d064ac8ed87" containerID="308ff932354439204db714059122252e2a210a8c56b1e6434d549f4cbde19db6" exitCode=0 Mar 10 00:28:36 crc kubenswrapper[4906]: I0310 00:28:36.738393 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"e028fc38-e4e1-4979-94f7-4d064ac8ed87","Type":"ContainerDied","Data":"308ff932354439204db714059122252e2a210a8c56b1e6434d549f4cbde19db6"} Mar 10 00:28:36 crc kubenswrapper[4906]: I0310 00:28:36.810726 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fmppw" Mar 10 00:28:36 crc kubenswrapper[4906]: I0310 00:28:36.920064 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fmppw"] Mar 10 00:28:37 crc kubenswrapper[4906]: I0310 00:28:37.750544 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"e028fc38-e4e1-4979-94f7-4d064ac8ed87","Type":"ContainerStarted","Data":"78f9013f5f81d6d15b4c51dc736a6d79a277668510c0ec401f41faf8942ee8b0"} Mar 10 00:28:37 crc kubenswrapper[4906]: I0310 00:28:37.793484 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-webhook-snmp-1-build" podStartSLOduration=3.793460792 podStartE2EDuration="3.793460792s" podCreationTimestamp="2026-03-10 00:28:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:28:37.789414769 +0000 UTC m=+1343.937309881" watchObservedRunningTime="2026-03-10 00:28:37.793460792 +0000 UTC m=+1343.941355944" Mar 10 00:28:38 crc kubenswrapper[4906]: I0310 00:28:38.760273 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fmppw" 
podUID="6cbf0261-72a9-4700-bca1-614749812f81" containerName="registry-server" containerID="cri-o://7b7ba308392c3dba50ac0c4abf61e8687d6b2cc3dd537a3a35f21bb0dcf18f9f" gracePeriod=2 Mar 10 00:28:39 crc kubenswrapper[4906]: I0310 00:28:39.209092 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fmppw" Mar 10 00:28:39 crc kubenswrapper[4906]: I0310 00:28:39.334364 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cbf0261-72a9-4700-bca1-614749812f81-catalog-content\") pod \"6cbf0261-72a9-4700-bca1-614749812f81\" (UID: \"6cbf0261-72a9-4700-bca1-614749812f81\") " Mar 10 00:28:39 crc kubenswrapper[4906]: I0310 00:28:39.334478 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cbf0261-72a9-4700-bca1-614749812f81-utilities\") pod \"6cbf0261-72a9-4700-bca1-614749812f81\" (UID: \"6cbf0261-72a9-4700-bca1-614749812f81\") " Mar 10 00:28:39 crc kubenswrapper[4906]: I0310 00:28:39.334528 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsg8w\" (UniqueName: \"kubernetes.io/projected/6cbf0261-72a9-4700-bca1-614749812f81-kube-api-access-jsg8w\") pod \"6cbf0261-72a9-4700-bca1-614749812f81\" (UID: \"6cbf0261-72a9-4700-bca1-614749812f81\") " Mar 10 00:28:39 crc kubenswrapper[4906]: I0310 00:28:39.335422 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cbf0261-72a9-4700-bca1-614749812f81-utilities" (OuterVolumeSpecName: "utilities") pod "6cbf0261-72a9-4700-bca1-614749812f81" (UID: "6cbf0261-72a9-4700-bca1-614749812f81"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:28:39 crc kubenswrapper[4906]: I0310 00:28:39.344904 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cbf0261-72a9-4700-bca1-614749812f81-kube-api-access-jsg8w" (OuterVolumeSpecName: "kube-api-access-jsg8w") pod "6cbf0261-72a9-4700-bca1-614749812f81" (UID: "6cbf0261-72a9-4700-bca1-614749812f81"). InnerVolumeSpecName "kube-api-access-jsg8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:28:39 crc kubenswrapper[4906]: I0310 00:28:39.401846 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cbf0261-72a9-4700-bca1-614749812f81-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6cbf0261-72a9-4700-bca1-614749812f81" (UID: "6cbf0261-72a9-4700-bca1-614749812f81"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:28:39 crc kubenswrapper[4906]: I0310 00:28:39.435812 4906 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cbf0261-72a9-4700-bca1-614749812f81-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 00:28:39 crc kubenswrapper[4906]: I0310 00:28:39.435847 4906 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cbf0261-72a9-4700-bca1-614749812f81-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 00:28:39 crc kubenswrapper[4906]: I0310 00:28:39.435858 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsg8w\" (UniqueName: \"kubernetes.io/projected/6cbf0261-72a9-4700-bca1-614749812f81-kube-api-access-jsg8w\") on node \"crc\" DevicePath \"\"" Mar 10 00:28:39 crc kubenswrapper[4906]: I0310 00:28:39.769425 4906 generic.go:334] "Generic (PLEG): container finished" podID="6cbf0261-72a9-4700-bca1-614749812f81" 
containerID="7b7ba308392c3dba50ac0c4abf61e8687d6b2cc3dd537a3a35f21bb0dcf18f9f" exitCode=0 Mar 10 00:28:39 crc kubenswrapper[4906]: I0310 00:28:39.769492 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fmppw" event={"ID":"6cbf0261-72a9-4700-bca1-614749812f81","Type":"ContainerDied","Data":"7b7ba308392c3dba50ac0c4abf61e8687d6b2cc3dd537a3a35f21bb0dcf18f9f"} Mar 10 00:28:39 crc kubenswrapper[4906]: I0310 00:28:39.769551 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fmppw" event={"ID":"6cbf0261-72a9-4700-bca1-614749812f81","Type":"ContainerDied","Data":"9b0c823f42ff3da35f3b179ceb392cbf7fc4228f07484aeb33c91a3daa91b936"} Mar 10 00:28:39 crc kubenswrapper[4906]: I0310 00:28:39.769562 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fmppw" Mar 10 00:28:39 crc kubenswrapper[4906]: I0310 00:28:39.769590 4906 scope.go:117] "RemoveContainer" containerID="7b7ba308392c3dba50ac0c4abf61e8687d6b2cc3dd537a3a35f21bb0dcf18f9f" Mar 10 00:28:39 crc kubenswrapper[4906]: I0310 00:28:39.786918 4906 scope.go:117] "RemoveContainer" containerID="d91580c802ead27b805bd5e98c3651f1d6c18c4a44e723237c04f992677836b4" Mar 10 00:28:39 crc kubenswrapper[4906]: I0310 00:28:39.821177 4906 scope.go:117] "RemoveContainer" containerID="bdcebff5f4e9a0154dcfbfa4d1e7b812575117c033ef893fb4f65f18ba1a7c27" Mar 10 00:28:39 crc kubenswrapper[4906]: I0310 00:28:39.827491 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fmppw"] Mar 10 00:28:39 crc kubenswrapper[4906]: I0310 00:28:39.837726 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fmppw"] Mar 10 00:28:39 crc kubenswrapper[4906]: I0310 00:28:39.838676 4906 scope.go:117] "RemoveContainer" containerID="7b7ba308392c3dba50ac0c4abf61e8687d6b2cc3dd537a3a35f21bb0dcf18f9f" Mar 10 
00:28:39 crc kubenswrapper[4906]: E0310 00:28:39.839036 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b7ba308392c3dba50ac0c4abf61e8687d6b2cc3dd537a3a35f21bb0dcf18f9f\": container with ID starting with 7b7ba308392c3dba50ac0c4abf61e8687d6b2cc3dd537a3a35f21bb0dcf18f9f not found: ID does not exist" containerID="7b7ba308392c3dba50ac0c4abf61e8687d6b2cc3dd537a3a35f21bb0dcf18f9f" Mar 10 00:28:39 crc kubenswrapper[4906]: I0310 00:28:39.839092 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b7ba308392c3dba50ac0c4abf61e8687d6b2cc3dd537a3a35f21bb0dcf18f9f"} err="failed to get container status \"7b7ba308392c3dba50ac0c4abf61e8687d6b2cc3dd537a3a35f21bb0dcf18f9f\": rpc error: code = NotFound desc = could not find container \"7b7ba308392c3dba50ac0c4abf61e8687d6b2cc3dd537a3a35f21bb0dcf18f9f\": container with ID starting with 7b7ba308392c3dba50ac0c4abf61e8687d6b2cc3dd537a3a35f21bb0dcf18f9f not found: ID does not exist" Mar 10 00:28:39 crc kubenswrapper[4906]: I0310 00:28:39.839118 4906 scope.go:117] "RemoveContainer" containerID="d91580c802ead27b805bd5e98c3651f1d6c18c4a44e723237c04f992677836b4" Mar 10 00:28:39 crc kubenswrapper[4906]: E0310 00:28:39.839463 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d91580c802ead27b805bd5e98c3651f1d6c18c4a44e723237c04f992677836b4\": container with ID starting with d91580c802ead27b805bd5e98c3651f1d6c18c4a44e723237c04f992677836b4 not found: ID does not exist" containerID="d91580c802ead27b805bd5e98c3651f1d6c18c4a44e723237c04f992677836b4" Mar 10 00:28:39 crc kubenswrapper[4906]: I0310 00:28:39.839497 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d91580c802ead27b805bd5e98c3651f1d6c18c4a44e723237c04f992677836b4"} err="failed to get container status 
\"d91580c802ead27b805bd5e98c3651f1d6c18c4a44e723237c04f992677836b4\": rpc error: code = NotFound desc = could not find container \"d91580c802ead27b805bd5e98c3651f1d6c18c4a44e723237c04f992677836b4\": container with ID starting with d91580c802ead27b805bd5e98c3651f1d6c18c4a44e723237c04f992677836b4 not found: ID does not exist" Mar 10 00:28:39 crc kubenswrapper[4906]: I0310 00:28:39.839523 4906 scope.go:117] "RemoveContainer" containerID="bdcebff5f4e9a0154dcfbfa4d1e7b812575117c033ef893fb4f65f18ba1a7c27" Mar 10 00:28:39 crc kubenswrapper[4906]: E0310 00:28:39.839841 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdcebff5f4e9a0154dcfbfa4d1e7b812575117c033ef893fb4f65f18ba1a7c27\": container with ID starting with bdcebff5f4e9a0154dcfbfa4d1e7b812575117c033ef893fb4f65f18ba1a7c27 not found: ID does not exist" containerID="bdcebff5f4e9a0154dcfbfa4d1e7b812575117c033ef893fb4f65f18ba1a7c27" Mar 10 00:28:39 crc kubenswrapper[4906]: I0310 00:28:39.839864 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdcebff5f4e9a0154dcfbfa4d1e7b812575117c033ef893fb4f65f18ba1a7c27"} err="failed to get container status \"bdcebff5f4e9a0154dcfbfa4d1e7b812575117c033ef893fb4f65f18ba1a7c27\": rpc error: code = NotFound desc = could not find container \"bdcebff5f4e9a0154dcfbfa4d1e7b812575117c033ef893fb4f65f18ba1a7c27\": container with ID starting with bdcebff5f4e9a0154dcfbfa4d1e7b812575117c033ef893fb4f65f18ba1a7c27 not found: ID does not exist" Mar 10 00:28:40 crc kubenswrapper[4906]: I0310 00:28:40.584775 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cbf0261-72a9-4700-bca1-614749812f81" path="/var/lib/kubelet/pods/6cbf0261-72a9-4700-bca1-614749812f81/volumes" Mar 10 00:28:45 crc kubenswrapper[4906]: I0310 00:28:45.503166 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 10 
00:28:45 crc kubenswrapper[4906]: I0310 00:28:45.505841 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/prometheus-webhook-snmp-1-build" podUID="e028fc38-e4e1-4979-94f7-4d064ac8ed87" containerName="docker-build" containerID="cri-o://78f9013f5f81d6d15b4c51dc736a6d79a277668510c0ec401f41faf8942ee8b0" gracePeriod=30 Mar 10 00:28:45 crc kubenswrapper[4906]: I0310 00:28:45.827402 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-1-build_e028fc38-e4e1-4979-94f7-4d064ac8ed87/docker-build/0.log" Mar 10 00:28:45 crc kubenswrapper[4906]: I0310 00:28:45.828085 4906 generic.go:334] "Generic (PLEG): container finished" podID="e028fc38-e4e1-4979-94f7-4d064ac8ed87" containerID="78f9013f5f81d6d15b4c51dc736a6d79a277668510c0ec401f41faf8942ee8b0" exitCode=1 Mar 10 00:28:45 crc kubenswrapper[4906]: I0310 00:28:45.828126 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"e028fc38-e4e1-4979-94f7-4d064ac8ed87","Type":"ContainerDied","Data":"78f9013f5f81d6d15b4c51dc736a6d79a277668510c0ec401f41faf8942ee8b0"} Mar 10 00:28:45 crc kubenswrapper[4906]: I0310 00:28:45.936973 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-1-build_e028fc38-e4e1-4979-94f7-4d064ac8ed87/docker-build/0.log" Mar 10 00:28:45 crc kubenswrapper[4906]: I0310 00:28:45.938068 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:28:46 crc kubenswrapper[4906]: I0310 00:28:46.025580 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/e028fc38-e4e1-4979-94f7-4d064ac8ed87-builder-dockercfg-7plhd-pull\") pod \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\" (UID: \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\") " Mar 10 00:28:46 crc kubenswrapper[4906]: I0310 00:28:46.025663 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e028fc38-e4e1-4979-94f7-4d064ac8ed87-container-storage-root\") pod \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\" (UID: \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\") " Mar 10 00:28:46 crc kubenswrapper[4906]: I0310 00:28:46.025719 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e028fc38-e4e1-4979-94f7-4d064ac8ed87-node-pullsecrets\") pod \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\" (UID: \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\") " Mar 10 00:28:46 crc kubenswrapper[4906]: I0310 00:28:46.025764 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e028fc38-e4e1-4979-94f7-4d064ac8ed87-container-storage-run\") pod \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\" (UID: \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\") " Mar 10 00:28:46 crc kubenswrapper[4906]: I0310 00:28:46.025788 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gs5fx\" (UniqueName: \"kubernetes.io/projected/e028fc38-e4e1-4979-94f7-4d064ac8ed87-kube-api-access-gs5fx\") pod \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\" (UID: \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\") " Mar 10 00:28:46 crc kubenswrapper[4906]: I0310 00:28:46.025950 4906 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e028fc38-e4e1-4979-94f7-4d064ac8ed87-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "e028fc38-e4e1-4979-94f7-4d064ac8ed87" (UID: "e028fc38-e4e1-4979-94f7-4d064ac8ed87"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:28:46 crc kubenswrapper[4906]: I0310 00:28:46.025996 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/e028fc38-e4e1-4979-94f7-4d064ac8ed87-builder-dockercfg-7plhd-push\") pod \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\" (UID: \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\") " Mar 10 00:28:46 crc kubenswrapper[4906]: I0310 00:28:46.026982 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e028fc38-e4e1-4979-94f7-4d064ac8ed87-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "e028fc38-e4e1-4979-94f7-4d064ac8ed87" (UID: "e028fc38-e4e1-4979-94f7-4d064ac8ed87"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:28:46 crc kubenswrapper[4906]: I0310 00:28:46.027006 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e028fc38-e4e1-4979-94f7-4d064ac8ed87-build-system-configs\") pod \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\" (UID: \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\") " Mar 10 00:28:46 crc kubenswrapper[4906]: I0310 00:28:46.027037 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e028fc38-e4e1-4979-94f7-4d064ac8ed87-buildworkdir\") pod \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\" (UID: \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\") " Mar 10 00:28:46 crc kubenswrapper[4906]: I0310 00:28:46.027087 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e028fc38-e4e1-4979-94f7-4d064ac8ed87-buildcachedir\") pod \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\" (UID: \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\") " Mar 10 00:28:46 crc kubenswrapper[4906]: I0310 00:28:46.027134 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e028fc38-e4e1-4979-94f7-4d064ac8ed87-build-proxy-ca-bundles\") pod \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\" (UID: \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\") " Mar 10 00:28:46 crc kubenswrapper[4906]: I0310 00:28:46.027156 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e028fc38-e4e1-4979-94f7-4d064ac8ed87-build-blob-cache\") pod \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\" (UID: \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\") " Mar 10 00:28:46 crc kubenswrapper[4906]: I0310 00:28:46.027182 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e028fc38-e4e1-4979-94f7-4d064ac8ed87-build-ca-bundles\") pod \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\" (UID: \"e028fc38-e4e1-4979-94f7-4d064ac8ed87\") " Mar 10 00:28:46 crc kubenswrapper[4906]: I0310 00:28:46.027255 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e028fc38-e4e1-4979-94f7-4d064ac8ed87-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "e028fc38-e4e1-4979-94f7-4d064ac8ed87" (UID: "e028fc38-e4e1-4979-94f7-4d064ac8ed87"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:28:46 crc kubenswrapper[4906]: I0310 00:28:46.027781 4906 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e028fc38-e4e1-4979-94f7-4d064ac8ed87-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 10 00:28:46 crc kubenswrapper[4906]: I0310 00:28:46.027805 4906 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e028fc38-e4e1-4979-94f7-4d064ac8ed87-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 10 00:28:46 crc kubenswrapper[4906]: I0310 00:28:46.027840 4906 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e028fc38-e4e1-4979-94f7-4d064ac8ed87-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 10 00:28:46 crc kubenswrapper[4906]: I0310 00:28:46.030198 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e028fc38-e4e1-4979-94f7-4d064ac8ed87-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "e028fc38-e4e1-4979-94f7-4d064ac8ed87" (UID: "e028fc38-e4e1-4979-94f7-4d064ac8ed87"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:28:46 crc kubenswrapper[4906]: I0310 00:28:46.030285 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e028fc38-e4e1-4979-94f7-4d064ac8ed87-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "e028fc38-e4e1-4979-94f7-4d064ac8ed87" (UID: "e028fc38-e4e1-4979-94f7-4d064ac8ed87"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:28:46 crc kubenswrapper[4906]: I0310 00:28:46.030507 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e028fc38-e4e1-4979-94f7-4d064ac8ed87-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "e028fc38-e4e1-4979-94f7-4d064ac8ed87" (UID: "e028fc38-e4e1-4979-94f7-4d064ac8ed87"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:28:46 crc kubenswrapper[4906]: I0310 00:28:46.030814 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e028fc38-e4e1-4979-94f7-4d064ac8ed87-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "e028fc38-e4e1-4979-94f7-4d064ac8ed87" (UID: "e028fc38-e4e1-4979-94f7-4d064ac8ed87"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:28:46 crc kubenswrapper[4906]: I0310 00:28:46.032856 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e028fc38-e4e1-4979-94f7-4d064ac8ed87-builder-dockercfg-7plhd-push" (OuterVolumeSpecName: "builder-dockercfg-7plhd-push") pod "e028fc38-e4e1-4979-94f7-4d064ac8ed87" (UID: "e028fc38-e4e1-4979-94f7-4d064ac8ed87"). InnerVolumeSpecName "builder-dockercfg-7plhd-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:28:46 crc kubenswrapper[4906]: I0310 00:28:46.032927 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e028fc38-e4e1-4979-94f7-4d064ac8ed87-kube-api-access-gs5fx" (OuterVolumeSpecName: "kube-api-access-gs5fx") pod "e028fc38-e4e1-4979-94f7-4d064ac8ed87" (UID: "e028fc38-e4e1-4979-94f7-4d064ac8ed87"). InnerVolumeSpecName "kube-api-access-gs5fx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:28:46 crc kubenswrapper[4906]: I0310 00:28:46.033833 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e028fc38-e4e1-4979-94f7-4d064ac8ed87-builder-dockercfg-7plhd-pull" (OuterVolumeSpecName: "builder-dockercfg-7plhd-pull") pod "e028fc38-e4e1-4979-94f7-4d064ac8ed87" (UID: "e028fc38-e4e1-4979-94f7-4d064ac8ed87"). InnerVolumeSpecName "builder-dockercfg-7plhd-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:28:46 crc kubenswrapper[4906]: I0310 00:28:46.120419 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e028fc38-e4e1-4979-94f7-4d064ac8ed87-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "e028fc38-e4e1-4979-94f7-4d064ac8ed87" (UID: "e028fc38-e4e1-4979-94f7-4d064ac8ed87"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:28:46 crc kubenswrapper[4906]: I0310 00:28:46.129210 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gs5fx\" (UniqueName: \"kubernetes.io/projected/e028fc38-e4e1-4979-94f7-4d064ac8ed87-kube-api-access-gs5fx\") on node \"crc\" DevicePath \"\"" Mar 10 00:28:46 crc kubenswrapper[4906]: I0310 00:28:46.129249 4906 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e028fc38-e4e1-4979-94f7-4d064ac8ed87-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 10 00:28:46 crc kubenswrapper[4906]: I0310 00:28:46.129268 4906 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/e028fc38-e4e1-4979-94f7-4d064ac8ed87-builder-dockercfg-7plhd-push\") on node \"crc\" DevicePath \"\"" Mar 10 00:28:46 crc kubenswrapper[4906]: I0310 00:28:46.129286 4906 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e028fc38-e4e1-4979-94f7-4d064ac8ed87-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 10 00:28:46 crc kubenswrapper[4906]: I0310 00:28:46.129304 4906 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e028fc38-e4e1-4979-94f7-4d064ac8ed87-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:28:46 crc kubenswrapper[4906]: I0310 00:28:46.129320 4906 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e028fc38-e4e1-4979-94f7-4d064ac8ed87-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 10 00:28:46 crc kubenswrapper[4906]: I0310 00:28:46.129338 4906 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e028fc38-e4e1-4979-94f7-4d064ac8ed87-build-ca-bundles\") on node \"crc\" DevicePath 
\"\"" Mar 10 00:28:46 crc kubenswrapper[4906]: I0310 00:28:46.129355 4906 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/e028fc38-e4e1-4979-94f7-4d064ac8ed87-builder-dockercfg-7plhd-pull\") on node \"crc\" DevicePath \"\"" Mar 10 00:28:46 crc kubenswrapper[4906]: I0310 00:28:46.426308 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e028fc38-e4e1-4979-94f7-4d064ac8ed87-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "e028fc38-e4e1-4979-94f7-4d064ac8ed87" (UID: "e028fc38-e4e1-4979-94f7-4d064ac8ed87"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:28:46 crc kubenswrapper[4906]: I0310 00:28:46.433251 4906 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e028fc38-e4e1-4979-94f7-4d064ac8ed87-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 10 00:28:46 crc kubenswrapper[4906]: I0310 00:28:46.843894 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-1-build_e028fc38-e4e1-4979-94f7-4d064ac8ed87/docker-build/0.log" Mar 10 00:28:46 crc kubenswrapper[4906]: I0310 00:28:46.845006 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"e028fc38-e4e1-4979-94f7-4d064ac8ed87","Type":"ContainerDied","Data":"2373bfbcf56802fafb2a4015f5ed842bcaf3670093e5598c7d8f6e33f48318ba"} Mar 10 00:28:46 crc kubenswrapper[4906]: I0310 00:28:46.845114 4906 scope.go:117] "RemoveContainer" containerID="78f9013f5f81d6d15b4c51dc736a6d79a277668510c0ec401f41faf8942ee8b0" Mar 10 00:28:46 crc kubenswrapper[4906]: I0310 00:28:46.845257 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 10 00:28:46 crc kubenswrapper[4906]: I0310 00:28:46.879345 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 10 00:28:46 crc kubenswrapper[4906]: I0310 00:28:46.894761 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 10 00:28:46 crc kubenswrapper[4906]: I0310 00:28:46.898754 4906 scope.go:117] "RemoveContainer" containerID="308ff932354439204db714059122252e2a210a8c56b1e6434d549f4cbde19db6" Mar 10 00:28:47 crc kubenswrapper[4906]: I0310 00:28:47.139014 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"] Mar 10 00:28:47 crc kubenswrapper[4906]: E0310 00:28:47.139793 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e028fc38-e4e1-4979-94f7-4d064ac8ed87" containerName="manage-dockerfile" Mar 10 00:28:47 crc kubenswrapper[4906]: I0310 00:28:47.139821 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="e028fc38-e4e1-4979-94f7-4d064ac8ed87" containerName="manage-dockerfile" Mar 10 00:28:47 crc kubenswrapper[4906]: E0310 00:28:47.139840 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cbf0261-72a9-4700-bca1-614749812f81" containerName="extract-utilities" Mar 10 00:28:47 crc kubenswrapper[4906]: I0310 00:28:47.139853 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cbf0261-72a9-4700-bca1-614749812f81" containerName="extract-utilities" Mar 10 00:28:47 crc kubenswrapper[4906]: E0310 00:28:47.139868 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cbf0261-72a9-4700-bca1-614749812f81" containerName="extract-content" Mar 10 00:28:47 crc kubenswrapper[4906]: I0310 00:28:47.139882 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cbf0261-72a9-4700-bca1-614749812f81" containerName="extract-content" Mar 10 00:28:47 
crc kubenswrapper[4906]: E0310 00:28:47.139901 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e028fc38-e4e1-4979-94f7-4d064ac8ed87" containerName="docker-build" Mar 10 00:28:47 crc kubenswrapper[4906]: I0310 00:28:47.139914 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="e028fc38-e4e1-4979-94f7-4d064ac8ed87" containerName="docker-build" Mar 10 00:28:47 crc kubenswrapper[4906]: E0310 00:28:47.139948 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cbf0261-72a9-4700-bca1-614749812f81" containerName="registry-server" Mar 10 00:28:47 crc kubenswrapper[4906]: I0310 00:28:47.139961 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cbf0261-72a9-4700-bca1-614749812f81" containerName="registry-server" Mar 10 00:28:47 crc kubenswrapper[4906]: I0310 00:28:47.140150 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="e028fc38-e4e1-4979-94f7-4d064ac8ed87" containerName="docker-build" Mar 10 00:28:47 crc kubenswrapper[4906]: I0310 00:28:47.140187 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cbf0261-72a9-4700-bca1-614749812f81" containerName="registry-server" Mar 10 00:28:47 crc kubenswrapper[4906]: I0310 00:28:47.141720 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:28:47 crc kubenswrapper[4906]: I0310 00:28:47.144557 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-2-global-ca" Mar 10 00:28:47 crc kubenswrapper[4906]: I0310 00:28:47.144588 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-2-sys-config" Mar 10 00:28:47 crc kubenswrapper[4906]: I0310 00:28:47.145585 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-2-ca" Mar 10 00:28:47 crc kubenswrapper[4906]: I0310 00:28:47.146316 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-7plhd" Mar 10 00:28:47 crc kubenswrapper[4906]: I0310 00:28:47.222222 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"] Mar 10 00:28:47 crc kubenswrapper[4906]: I0310 00:28:47.246491 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:28:47 crc kubenswrapper[4906]: I0310 00:28:47.246546 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-builder-dockercfg-7plhd-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:28:47 crc kubenswrapper[4906]: I0310 00:28:47.246579 4906 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:28:47 crc kubenswrapper[4906]: I0310 00:28:47.246614 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-builder-dockercfg-7plhd-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:28:47 crc kubenswrapper[4906]: I0310 00:28:47.246681 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:28:47 crc kubenswrapper[4906]: I0310 00:28:47.246827 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:28:47 crc kubenswrapper[4906]: I0310 00:28:47.246958 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: 
\"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:28:47 crc kubenswrapper[4906]: I0310 00:28:47.247009 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:28:47 crc kubenswrapper[4906]: I0310 00:28:47.247049 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:28:47 crc kubenswrapper[4906]: I0310 00:28:47.247090 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:28:47 crc kubenswrapper[4906]: I0310 00:28:47.247119 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:28:47 crc kubenswrapper[4906]: I0310 00:28:47.247172 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-9n6f4\" (UniqueName: \"kubernetes.io/projected/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-kube-api-access-9n6f4\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:28:47 crc kubenswrapper[4906]: I0310 00:28:47.348433 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:28:47 crc kubenswrapper[4906]: I0310 00:28:47.348484 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:28:47 crc kubenswrapper[4906]: I0310 00:28:47.348506 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:28:47 crc kubenswrapper[4906]: I0310 00:28:47.348528 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:28:47 crc kubenswrapper[4906]: I0310 00:28:47.348552 4906 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:28:47 crc kubenswrapper[4906]: I0310 00:28:47.348595 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9n6f4\" (UniqueName: \"kubernetes.io/projected/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-kube-api-access-9n6f4\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:28:47 crc kubenswrapper[4906]: I0310 00:28:47.348629 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:28:47 crc kubenswrapper[4906]: I0310 00:28:47.348670 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-builder-dockercfg-7plhd-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:28:47 crc kubenswrapper[4906]: I0310 00:28:47.348691 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\") " 
pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:28:47 crc kubenswrapper[4906]: I0310 00:28:47.348715 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-builder-dockercfg-7plhd-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:28:47 crc kubenswrapper[4906]: I0310 00:28:47.348736 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:28:47 crc kubenswrapper[4906]: I0310 00:28:47.348766 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:28:47 crc kubenswrapper[4906]: I0310 00:28:47.348998 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:28:47 crc kubenswrapper[4906]: I0310 00:28:47.349252 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: 
\"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:28:47 crc kubenswrapper[4906]: I0310 00:28:47.349421 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:28:47 crc kubenswrapper[4906]: I0310 00:28:47.349711 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:28:47 crc kubenswrapper[4906]: I0310 00:28:47.349792 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:28:47 crc kubenswrapper[4906]: I0310 00:28:47.349963 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:28:47 crc kubenswrapper[4906]: I0310 00:28:47.349995 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-build-blob-cache\") pod 
\"prometheus-webhook-snmp-2-build\" (UID: \"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:28:47 crc kubenswrapper[4906]: I0310 00:28:47.350186 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:28:47 crc kubenswrapper[4906]: I0310 00:28:47.350418 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:28:47 crc kubenswrapper[4906]: I0310 00:28:47.355336 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-builder-dockercfg-7plhd-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:28:47 crc kubenswrapper[4906]: I0310 00:28:47.355339 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-builder-dockercfg-7plhd-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:28:47 crc kubenswrapper[4906]: I0310 00:28:47.371016 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n6f4\" (UniqueName: 
\"kubernetes.io/projected/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-kube-api-access-9n6f4\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:28:47 crc kubenswrapper[4906]: I0310 00:28:47.465363 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:28:47 crc kubenswrapper[4906]: I0310 00:28:47.730702 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"] Mar 10 00:28:47 crc kubenswrapper[4906]: I0310 00:28:47.854981 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"d657e1e1-a1b0-400f-aab1-ce179ebbaf93","Type":"ContainerStarted","Data":"cecdd852a18fab9d35a1e2bb59d3b4a8cd347dd3d31bf1b5d3a6addd2ae3a840"} Mar 10 00:28:48 crc kubenswrapper[4906]: I0310 00:28:48.586894 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e028fc38-e4e1-4979-94f7-4d064ac8ed87" path="/var/lib/kubelet/pods/e028fc38-e4e1-4979-94f7-4d064ac8ed87/volumes" Mar 10 00:28:48 crc kubenswrapper[4906]: I0310 00:28:48.867245 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"d657e1e1-a1b0-400f-aab1-ce179ebbaf93","Type":"ContainerStarted","Data":"e66d6e2877d9d725ee0912e228c51db4170a77ac9fbe2fe81f83c3b741c2abc3"} Mar 10 00:28:49 crc kubenswrapper[4906]: I0310 00:28:49.879460 4906 generic.go:334] "Generic (PLEG): container finished" podID="d657e1e1-a1b0-400f-aab1-ce179ebbaf93" containerID="e66d6e2877d9d725ee0912e228c51db4170a77ac9fbe2fe81f83c3b741c2abc3" exitCode=0 Mar 10 00:28:49 crc kubenswrapper[4906]: I0310 00:28:49.879573 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" 
event={"ID":"d657e1e1-a1b0-400f-aab1-ce179ebbaf93","Type":"ContainerDied","Data":"e66d6e2877d9d725ee0912e228c51db4170a77ac9fbe2fe81f83c3b741c2abc3"} Mar 10 00:28:50 crc kubenswrapper[4906]: I0310 00:28:50.891236 4906 generic.go:334] "Generic (PLEG): container finished" podID="d657e1e1-a1b0-400f-aab1-ce179ebbaf93" containerID="ffde67c094e00e53126327d3519729c2aa3f36dfecf9daa4964d0e44947240de" exitCode=0 Mar 10 00:28:50 crc kubenswrapper[4906]: I0310 00:28:50.891341 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"d657e1e1-a1b0-400f-aab1-ce179ebbaf93","Type":"ContainerDied","Data":"ffde67c094e00e53126327d3519729c2aa3f36dfecf9daa4964d0e44947240de"} Mar 10 00:28:50 crc kubenswrapper[4906]: I0310 00:28:50.940745 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-2-build_d657e1e1-a1b0-400f-aab1-ce179ebbaf93/manage-dockerfile/0.log" Mar 10 00:28:51 crc kubenswrapper[4906]: I0310 00:28:51.901896 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"d657e1e1-a1b0-400f-aab1-ce179ebbaf93","Type":"ContainerStarted","Data":"01d035b2e6bc06c8a7671ffe77ae6462a663663f592f356a0e35be4809e1319a"} Mar 10 00:28:51 crc kubenswrapper[4906]: I0310 00:28:51.934782 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-webhook-snmp-2-build" podStartSLOduration=4.934756625 podStartE2EDuration="4.934756625s" podCreationTimestamp="2026-03-10 00:28:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:28:51.926984818 +0000 UTC m=+1358.074879980" watchObservedRunningTime="2026-03-10 00:28:51.934756625 +0000 UTC m=+1358.082651747" Mar 10 00:29:00 crc kubenswrapper[4906]: I0310 00:29:00.503025 4906 patch_prober.go:28] interesting 
pod/machine-config-daemon-bxtw4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:29:00 crc kubenswrapper[4906]: I0310 00:29:00.503888 4906 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:29:00 crc kubenswrapper[4906]: I0310 00:29:00.503976 4906 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" Mar 10 00:29:00 crc kubenswrapper[4906]: I0310 00:29:00.505313 4906 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6847583fc3b7bdeec69f6786020e94f393a41e01c5039c9b2618c4b51a1b7db5"} pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 00:29:00 crc kubenswrapper[4906]: I0310 00:29:00.505443 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" containerName="machine-config-daemon" containerID="cri-o://6847583fc3b7bdeec69f6786020e94f393a41e01c5039c9b2618c4b51a1b7db5" gracePeriod=600 Mar 10 00:29:00 crc kubenswrapper[4906]: I0310 00:29:00.967909 4906 generic.go:334] "Generic (PLEG): container finished" podID="72d61d35-0a64-45a5-8df3-9c429727deba" containerID="6847583fc3b7bdeec69f6786020e94f393a41e01c5039c9b2618c4b51a1b7db5" exitCode=0 Mar 10 00:29:00 crc kubenswrapper[4906]: I0310 00:29:00.968002 
4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" event={"ID":"72d61d35-0a64-45a5-8df3-9c429727deba","Type":"ContainerDied","Data":"6847583fc3b7bdeec69f6786020e94f393a41e01c5039c9b2618c4b51a1b7db5"} Mar 10 00:29:00 crc kubenswrapper[4906]: I0310 00:29:00.968480 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" event={"ID":"72d61d35-0a64-45a5-8df3-9c429727deba","Type":"ContainerStarted","Data":"3d03901045dd3898d4d4472388a38a1a6380b91b22032509551213e74b4671a3"} Mar 10 00:29:00 crc kubenswrapper[4906]: I0310 00:29:00.968532 4906 scope.go:117] "RemoveContainer" containerID="ecbb89ce657e5a301b803c163bbd70823102707b301c704efb62181f5820ef4b" Mar 10 00:29:44 crc kubenswrapper[4906]: I0310 00:29:44.293077 4906 generic.go:334] "Generic (PLEG): container finished" podID="d657e1e1-a1b0-400f-aab1-ce179ebbaf93" containerID="01d035b2e6bc06c8a7671ffe77ae6462a663663f592f356a0e35be4809e1319a" exitCode=0 Mar 10 00:29:44 crc kubenswrapper[4906]: I0310 00:29:44.293172 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"d657e1e1-a1b0-400f-aab1-ce179ebbaf93","Type":"ContainerDied","Data":"01d035b2e6bc06c8a7671ffe77ae6462a663663f592f356a0e35be4809e1319a"} Mar 10 00:29:45 crc kubenswrapper[4906]: I0310 00:29:45.574771 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:29:45 crc kubenswrapper[4906]: I0310 00:29:45.622777 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-build-system-configs\") pod \"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\" (UID: \"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\") " Mar 10 00:29:45 crc kubenswrapper[4906]: I0310 00:29:45.622818 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-node-pullsecrets\") pod \"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\" (UID: \"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\") " Mar 10 00:29:45 crc kubenswrapper[4906]: I0310 00:29:45.623190 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "d657e1e1-a1b0-400f-aab1-ce179ebbaf93" (UID: "d657e1e1-a1b0-400f-aab1-ce179ebbaf93"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:29:45 crc kubenswrapper[4906]: I0310 00:29:45.623266 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-builder-dockercfg-7plhd-push\") pod \"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\" (UID: \"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\") " Mar 10 00:29:45 crc kubenswrapper[4906]: I0310 00:29:45.623358 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-build-blob-cache\") pod \"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\" (UID: \"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\") " Mar 10 00:29:45 crc kubenswrapper[4906]: I0310 00:29:45.623406 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9n6f4\" (UniqueName: \"kubernetes.io/projected/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-kube-api-access-9n6f4\") pod \"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\" (UID: \"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\") " Mar 10 00:29:45 crc kubenswrapper[4906]: I0310 00:29:45.623443 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-build-proxy-ca-bundles\") pod \"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\" (UID: \"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\") " Mar 10 00:29:45 crc kubenswrapper[4906]: I0310 00:29:45.623466 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-container-storage-run\") pod \"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\" (UID: \"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\") " Mar 10 00:29:45 crc kubenswrapper[4906]: I0310 00:29:45.623486 4906 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-build-ca-bundles\") pod \"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\" (UID: \"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\") " Mar 10 00:29:45 crc kubenswrapper[4906]: I0310 00:29:45.623504 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-buildcachedir\") pod \"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\" (UID: \"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\") " Mar 10 00:29:45 crc kubenswrapper[4906]: I0310 00:29:45.623526 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-container-storage-root\") pod \"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\" (UID: \"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\") " Mar 10 00:29:45 crc kubenswrapper[4906]: I0310 00:29:45.623546 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-buildworkdir\") pod \"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\" (UID: \"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\") " Mar 10 00:29:45 crc kubenswrapper[4906]: I0310 00:29:45.623566 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-builder-dockercfg-7plhd-pull\") pod \"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\" (UID: \"d657e1e1-a1b0-400f-aab1-ce179ebbaf93\") " Mar 10 00:29:45 crc kubenswrapper[4906]: I0310 00:29:45.623750 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod 
"d657e1e1-a1b0-400f-aab1-ce179ebbaf93" (UID: "d657e1e1-a1b0-400f-aab1-ce179ebbaf93"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:29:45 crc kubenswrapper[4906]: I0310 00:29:45.623788 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "d657e1e1-a1b0-400f-aab1-ce179ebbaf93" (UID: "d657e1e1-a1b0-400f-aab1-ce179ebbaf93"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:29:45 crc kubenswrapper[4906]: I0310 00:29:45.624202 4906 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:45 crc kubenswrapper[4906]: I0310 00:29:45.624224 4906 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:45 crc kubenswrapper[4906]: I0310 00:29:45.624239 4906 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:45 crc kubenswrapper[4906]: I0310 00:29:45.624485 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "d657e1e1-a1b0-400f-aab1-ce179ebbaf93" (UID: "d657e1e1-a1b0-400f-aab1-ce179ebbaf93"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:29:45 crc kubenswrapper[4906]: I0310 00:29:45.624734 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "d657e1e1-a1b0-400f-aab1-ce179ebbaf93" (UID: "d657e1e1-a1b0-400f-aab1-ce179ebbaf93"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:29:45 crc kubenswrapper[4906]: I0310 00:29:45.626666 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "d657e1e1-a1b0-400f-aab1-ce179ebbaf93" (UID: "d657e1e1-a1b0-400f-aab1-ce179ebbaf93"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:29:45 crc kubenswrapper[4906]: I0310 00:29:45.627009 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "d657e1e1-a1b0-400f-aab1-ce179ebbaf93" (UID: "d657e1e1-a1b0-400f-aab1-ce179ebbaf93"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:29:45 crc kubenswrapper[4906]: I0310 00:29:45.630159 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-builder-dockercfg-7plhd-push" (OuterVolumeSpecName: "builder-dockercfg-7plhd-push") pod "d657e1e1-a1b0-400f-aab1-ce179ebbaf93" (UID: "d657e1e1-a1b0-400f-aab1-ce179ebbaf93"). InnerVolumeSpecName "builder-dockercfg-7plhd-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:29:45 crc kubenswrapper[4906]: I0310 00:29:45.630180 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-builder-dockercfg-7plhd-pull" (OuterVolumeSpecName: "builder-dockercfg-7plhd-pull") pod "d657e1e1-a1b0-400f-aab1-ce179ebbaf93" (UID: "d657e1e1-a1b0-400f-aab1-ce179ebbaf93"). InnerVolumeSpecName "builder-dockercfg-7plhd-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:29:45 crc kubenswrapper[4906]: I0310 00:29:45.632534 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-kube-api-access-9n6f4" (OuterVolumeSpecName: "kube-api-access-9n6f4") pod "d657e1e1-a1b0-400f-aab1-ce179ebbaf93" (UID: "d657e1e1-a1b0-400f-aab1-ce179ebbaf93"). InnerVolumeSpecName "kube-api-access-9n6f4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:29:45 crc kubenswrapper[4906]: I0310 00:29:45.725898 4906 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-builder-dockercfg-7plhd-push\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:45 crc kubenswrapper[4906]: I0310 00:29:45.725930 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9n6f4\" (UniqueName: \"kubernetes.io/projected/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-kube-api-access-9n6f4\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:45 crc kubenswrapper[4906]: I0310 00:29:45.725939 4906 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:45 crc kubenswrapper[4906]: I0310 00:29:45.725947 4906 reconciler_common.go:293] "Volume detached for volume 
\"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:45 crc kubenswrapper[4906]: I0310 00:29:45.725956 4906 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:45 crc kubenswrapper[4906]: I0310 00:29:45.725965 4906 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:45 crc kubenswrapper[4906]: I0310 00:29:45.725973 4906 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-builder-dockercfg-7plhd-pull\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:45 crc kubenswrapper[4906]: I0310 00:29:45.744753 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "d657e1e1-a1b0-400f-aab1-ce179ebbaf93" (UID: "d657e1e1-a1b0-400f-aab1-ce179ebbaf93"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:29:45 crc kubenswrapper[4906]: I0310 00:29:45.827229 4906 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:46 crc kubenswrapper[4906]: I0310 00:29:46.314074 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"d657e1e1-a1b0-400f-aab1-ce179ebbaf93","Type":"ContainerDied","Data":"cecdd852a18fab9d35a1e2bb59d3b4a8cd347dd3d31bf1b5d3a6addd2ae3a840"} Mar 10 00:29:46 crc kubenswrapper[4906]: I0310 00:29:46.314446 4906 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cecdd852a18fab9d35a1e2bb59d3b4a8cd347dd3d31bf1b5d3a6addd2ae3a840" Mar 10 00:29:46 crc kubenswrapper[4906]: I0310 00:29:46.314250 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 10 00:29:46 crc kubenswrapper[4906]: I0310 00:29:46.628335 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "d657e1e1-a1b0-400f-aab1-ce179ebbaf93" (UID: "d657e1e1-a1b0-400f-aab1-ce179ebbaf93"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:29:46 crc kubenswrapper[4906]: I0310 00:29:46.640831 4906 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d657e1e1-a1b0-400f-aab1-ce179ebbaf93-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:55 crc kubenswrapper[4906]: I0310 00:29:55.005514 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Mar 10 00:29:55 crc kubenswrapper[4906]: E0310 00:29:55.007416 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d657e1e1-a1b0-400f-aab1-ce179ebbaf93" containerName="git-clone" Mar 10 00:29:55 crc kubenswrapper[4906]: I0310 00:29:55.007499 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="d657e1e1-a1b0-400f-aab1-ce179ebbaf93" containerName="git-clone" Mar 10 00:29:55 crc kubenswrapper[4906]: E0310 00:29:55.007589 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d657e1e1-a1b0-400f-aab1-ce179ebbaf93" containerName="manage-dockerfile" Mar 10 00:29:55 crc kubenswrapper[4906]: I0310 00:29:55.007729 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="d657e1e1-a1b0-400f-aab1-ce179ebbaf93" containerName="manage-dockerfile" Mar 10 00:29:55 crc kubenswrapper[4906]: E0310 00:29:55.007809 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d657e1e1-a1b0-400f-aab1-ce179ebbaf93" containerName="docker-build" Mar 10 00:29:55 crc kubenswrapper[4906]: I0310 00:29:55.007869 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="d657e1e1-a1b0-400f-aab1-ce179ebbaf93" containerName="docker-build" Mar 10 00:29:55 crc kubenswrapper[4906]: I0310 00:29:55.008037 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="d657e1e1-a1b0-400f-aab1-ce179ebbaf93" containerName="docker-build" Mar 10 00:29:55 crc kubenswrapper[4906]: I0310 00:29:55.008690 4906 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:29:55 crc kubenswrapper[4906]: I0310 00:29:55.010912 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-1-sys-config" Mar 10 00:29:55 crc kubenswrapper[4906]: I0310 00:29:55.011009 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-1-ca" Mar 10 00:29:55 crc kubenswrapper[4906]: I0310 00:29:55.011392 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-7plhd" Mar 10 00:29:55 crc kubenswrapper[4906]: I0310 00:29:55.013201 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-1-global-ca" Mar 10 00:29:55 crc kubenswrapper[4906]: I0310 00:29:55.028017 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Mar 10 00:29:55 crc kubenswrapper[4906]: I0310 00:29:55.070266 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/659b6729-41d1-41ef-8b6c-8aacaea8e58a-buildworkdir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:29:55 crc kubenswrapper[4906]: I0310 00:29:55.070314 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/659b6729-41d1-41ef-8b6c-8aacaea8e58a-container-storage-root\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:29:55 crc kubenswrapper[4906]: 
I0310 00:29:55.070352 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/659b6729-41d1-41ef-8b6c-8aacaea8e58a-build-system-configs\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:29:55 crc kubenswrapper[4906]: I0310 00:29:55.070525 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/659b6729-41d1-41ef-8b6c-8aacaea8e58a-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:29:55 crc kubenswrapper[4906]: I0310 00:29:55.070678 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/659b6729-41d1-41ef-8b6c-8aacaea8e58a-build-blob-cache\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:29:55 crc kubenswrapper[4906]: I0310 00:29:55.070787 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/659b6729-41d1-41ef-8b6c-8aacaea8e58a-build-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:29:55 crc kubenswrapper[4906]: I0310 00:29:55.070844 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-7plhd-push\" (UniqueName: 
\"kubernetes.io/secret/659b6729-41d1-41ef-8b6c-8aacaea8e58a-builder-dockercfg-7plhd-push\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:29:55 crc kubenswrapper[4906]: I0310 00:29:55.070952 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/659b6729-41d1-41ef-8b6c-8aacaea8e58a-node-pullsecrets\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:29:55 crc kubenswrapper[4906]: I0310 00:29:55.070984 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/659b6729-41d1-41ef-8b6c-8aacaea8e58a-builder-dockercfg-7plhd-pull\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:29:55 crc kubenswrapper[4906]: I0310 00:29:55.071378 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/659b6729-41d1-41ef-8b6c-8aacaea8e58a-buildcachedir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:29:55 crc kubenswrapper[4906]: I0310 00:29:55.071461 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/659b6729-41d1-41ef-8b6c-8aacaea8e58a-container-storage-run\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\") 
" pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:29:55 crc kubenswrapper[4906]: I0310 00:29:55.071487 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfks7\" (UniqueName: \"kubernetes.io/projected/659b6729-41d1-41ef-8b6c-8aacaea8e58a-kube-api-access-vfks7\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:29:55 crc kubenswrapper[4906]: I0310 00:29:55.173060 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/659b6729-41d1-41ef-8b6c-8aacaea8e58a-buildcachedir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:29:55 crc kubenswrapper[4906]: I0310 00:29:55.173456 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/659b6729-41d1-41ef-8b6c-8aacaea8e58a-container-storage-run\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:29:55 crc kubenswrapper[4906]: I0310 00:29:55.173224 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/659b6729-41d1-41ef-8b6c-8aacaea8e58a-buildcachedir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:29:55 crc kubenswrapper[4906]: I0310 00:29:55.173496 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfks7\" (UniqueName: 
\"kubernetes.io/projected/659b6729-41d1-41ef-8b6c-8aacaea8e58a-kube-api-access-vfks7\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:29:55 crc kubenswrapper[4906]: I0310 00:29:55.173705 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/659b6729-41d1-41ef-8b6c-8aacaea8e58a-buildworkdir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:29:55 crc kubenswrapper[4906]: I0310 00:29:55.173756 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/659b6729-41d1-41ef-8b6c-8aacaea8e58a-container-storage-root\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:29:55 crc kubenswrapper[4906]: I0310 00:29:55.173862 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/659b6729-41d1-41ef-8b6c-8aacaea8e58a-build-system-configs\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:29:55 crc kubenswrapper[4906]: I0310 00:29:55.173939 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/659b6729-41d1-41ef-8b6c-8aacaea8e58a-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 
00:29:55 crc kubenswrapper[4906]: I0310 00:29:55.174040 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/659b6729-41d1-41ef-8b6c-8aacaea8e58a-build-blob-cache\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:29:55 crc kubenswrapper[4906]: I0310 00:29:55.174058 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/659b6729-41d1-41ef-8b6c-8aacaea8e58a-container-storage-run\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:29:55 crc kubenswrapper[4906]: I0310 00:29:55.174151 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/659b6729-41d1-41ef-8b6c-8aacaea8e58a-build-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:29:55 crc kubenswrapper[4906]: I0310 00:29:55.174212 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/659b6729-41d1-41ef-8b6c-8aacaea8e58a-builder-dockercfg-7plhd-push\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:29:55 crc kubenswrapper[4906]: I0310 00:29:55.174245 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/659b6729-41d1-41ef-8b6c-8aacaea8e58a-container-storage-root\") pod 
\"service-telemetry-operator-bundle-1-build\" (UID: \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:29:55 crc kubenswrapper[4906]: I0310 00:29:55.174302 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/659b6729-41d1-41ef-8b6c-8aacaea8e58a-node-pullsecrets\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:29:55 crc kubenswrapper[4906]: I0310 00:29:55.174326 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/659b6729-41d1-41ef-8b6c-8aacaea8e58a-builder-dockercfg-7plhd-pull\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:29:55 crc kubenswrapper[4906]: I0310 00:29:55.174463 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/659b6729-41d1-41ef-8b6c-8aacaea8e58a-build-system-configs\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:29:55 crc kubenswrapper[4906]: I0310 00:29:55.174521 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/659b6729-41d1-41ef-8b6c-8aacaea8e58a-node-pullsecrets\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:29:55 crc kubenswrapper[4906]: I0310 00:29:55.174537 4906 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/659b6729-41d1-41ef-8b6c-8aacaea8e58a-buildworkdir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:29:55 crc kubenswrapper[4906]: I0310 00:29:55.174739 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/659b6729-41d1-41ef-8b6c-8aacaea8e58a-build-blob-cache\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:29:55 crc kubenswrapper[4906]: I0310 00:29:55.175715 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/659b6729-41d1-41ef-8b6c-8aacaea8e58a-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:29:55 crc kubenswrapper[4906]: I0310 00:29:55.175797 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/659b6729-41d1-41ef-8b6c-8aacaea8e58a-build-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:29:55 crc kubenswrapper[4906]: I0310 00:29:55.184662 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/659b6729-41d1-41ef-8b6c-8aacaea8e58a-builder-dockercfg-7plhd-pull\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\") " 
pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:29:55 crc kubenswrapper[4906]: I0310 00:29:55.184745 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/659b6729-41d1-41ef-8b6c-8aacaea8e58a-builder-dockercfg-7plhd-push\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:29:55 crc kubenswrapper[4906]: I0310 00:29:55.192201 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfks7\" (UniqueName: \"kubernetes.io/projected/659b6729-41d1-41ef-8b6c-8aacaea8e58a-kube-api-access-vfks7\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:29:55 crc kubenswrapper[4906]: I0310 00:29:55.327952 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:29:55 crc kubenswrapper[4906]: I0310 00:29:55.832359 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Mar 10 00:29:56 crc kubenswrapper[4906]: I0310 00:29:56.396758 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"659b6729-41d1-41ef-8b6c-8aacaea8e58a","Type":"ContainerStarted","Data":"1a51b6eaa5dbdfbdbd74072c4227a21d56f3cc8f0ef9cc18d94d51ef41e8bd21"} Mar 10 00:29:57 crc kubenswrapper[4906]: I0310 00:29:57.408001 4906 generic.go:334] "Generic (PLEG): container finished" podID="659b6729-41d1-41ef-8b6c-8aacaea8e58a" containerID="c228b9f5968a296b723ea7a7472f74d59f19ed119f2e9fcdf135bdf95bc80716" exitCode=0 Mar 10 00:29:57 crc kubenswrapper[4906]: I0310 00:29:57.408315 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"659b6729-41d1-41ef-8b6c-8aacaea8e58a","Type":"ContainerDied","Data":"c228b9f5968a296b723ea7a7472f74d59f19ed119f2e9fcdf135bdf95bc80716"} Mar 10 00:29:58 crc kubenswrapper[4906]: I0310 00:29:58.420048 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-1-build_659b6729-41d1-41ef-8b6c-8aacaea8e58a/docker-build/0.log" Mar 10 00:29:58 crc kubenswrapper[4906]: I0310 00:29:58.420977 4906 generic.go:334] "Generic (PLEG): container finished" podID="659b6729-41d1-41ef-8b6c-8aacaea8e58a" containerID="41236f41b325fe004bc18cc590365d958c7b0b793d4ab9dadfd6d45082642cfa" exitCode=1 Mar 10 00:29:58 crc kubenswrapper[4906]: I0310 00:29:58.421030 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" 
event={"ID":"659b6729-41d1-41ef-8b6c-8aacaea8e58a","Type":"ContainerDied","Data":"41236f41b325fe004bc18cc590365d958c7b0b793d4ab9dadfd6d45082642cfa"} Mar 10 00:29:59 crc kubenswrapper[4906]: I0310 00:29:59.643629 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-1-build_659b6729-41d1-41ef-8b6c-8aacaea8e58a/docker-build/0.log" Mar 10 00:29:59 crc kubenswrapper[4906]: I0310 00:29:59.644551 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:29:59 crc kubenswrapper[4906]: I0310 00:29:59.749273 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/659b6729-41d1-41ef-8b6c-8aacaea8e58a-buildcachedir\") pod \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\" (UID: \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\") " Mar 10 00:29:59 crc kubenswrapper[4906]: I0310 00:29:59.749342 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/659b6729-41d1-41ef-8b6c-8aacaea8e58a-container-storage-run\") pod \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\" (UID: \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\") " Mar 10 00:29:59 crc kubenswrapper[4906]: I0310 00:29:59.749400 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/659b6729-41d1-41ef-8b6c-8aacaea8e58a-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "659b6729-41d1-41ef-8b6c-8aacaea8e58a" (UID: "659b6729-41d1-41ef-8b6c-8aacaea8e58a"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:29:59 crc kubenswrapper[4906]: I0310 00:29:59.749423 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/659b6729-41d1-41ef-8b6c-8aacaea8e58a-build-blob-cache\") pod \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\" (UID: \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\") " Mar 10 00:29:59 crc kubenswrapper[4906]: I0310 00:29:59.749452 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/659b6729-41d1-41ef-8b6c-8aacaea8e58a-builder-dockercfg-7plhd-pull\") pod \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\" (UID: \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\") " Mar 10 00:29:59 crc kubenswrapper[4906]: I0310 00:29:59.749474 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/659b6729-41d1-41ef-8b6c-8aacaea8e58a-node-pullsecrets\") pod \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\" (UID: \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\") " Mar 10 00:29:59 crc kubenswrapper[4906]: I0310 00:29:59.749509 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/659b6729-41d1-41ef-8b6c-8aacaea8e58a-build-system-configs\") pod \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\" (UID: \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\") " Mar 10 00:29:59 crc kubenswrapper[4906]: I0310 00:29:59.749558 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfks7\" (UniqueName: \"kubernetes.io/projected/659b6729-41d1-41ef-8b6c-8aacaea8e58a-kube-api-access-vfks7\") pod \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\" (UID: \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\") " Mar 10 00:29:59 crc kubenswrapper[4906]: I0310 00:29:59.749588 4906 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/659b6729-41d1-41ef-8b6c-8aacaea8e58a-build-proxy-ca-bundles\") pod \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\" (UID: \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\") " Mar 10 00:29:59 crc kubenswrapper[4906]: I0310 00:29:59.749597 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/659b6729-41d1-41ef-8b6c-8aacaea8e58a-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "659b6729-41d1-41ef-8b6c-8aacaea8e58a" (UID: "659b6729-41d1-41ef-8b6c-8aacaea8e58a"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:29:59 crc kubenswrapper[4906]: I0310 00:29:59.749609 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/659b6729-41d1-41ef-8b6c-8aacaea8e58a-buildworkdir\") pod \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\" (UID: \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\") " Mar 10 00:29:59 crc kubenswrapper[4906]: I0310 00:29:59.749656 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/659b6729-41d1-41ef-8b6c-8aacaea8e58a-container-storage-root\") pod \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\" (UID: \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\") " Mar 10 00:29:59 crc kubenswrapper[4906]: I0310 00:29:59.749677 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/659b6729-41d1-41ef-8b6c-8aacaea8e58a-build-ca-bundles\") pod \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\" (UID: \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\") " Mar 10 00:29:59 crc kubenswrapper[4906]: I0310 00:29:59.749697 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-7plhd-push\" (UniqueName: 
\"kubernetes.io/secret/659b6729-41d1-41ef-8b6c-8aacaea8e58a-builder-dockercfg-7plhd-push\") pod \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\" (UID: \"659b6729-41d1-41ef-8b6c-8aacaea8e58a\") " Mar 10 00:29:59 crc kubenswrapper[4906]: I0310 00:29:59.749924 4906 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/659b6729-41d1-41ef-8b6c-8aacaea8e58a-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:59 crc kubenswrapper[4906]: I0310 00:29:59.749936 4906 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/659b6729-41d1-41ef-8b6c-8aacaea8e58a-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:59 crc kubenswrapper[4906]: I0310 00:29:59.751977 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/659b6729-41d1-41ef-8b6c-8aacaea8e58a-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "659b6729-41d1-41ef-8b6c-8aacaea8e58a" (UID: "659b6729-41d1-41ef-8b6c-8aacaea8e58a"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:29:59 crc kubenswrapper[4906]: I0310 00:29:59.752075 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/659b6729-41d1-41ef-8b6c-8aacaea8e58a-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "659b6729-41d1-41ef-8b6c-8aacaea8e58a" (UID: "659b6729-41d1-41ef-8b6c-8aacaea8e58a"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:29:59 crc kubenswrapper[4906]: I0310 00:29:59.752679 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/659b6729-41d1-41ef-8b6c-8aacaea8e58a-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "659b6729-41d1-41ef-8b6c-8aacaea8e58a" (UID: "659b6729-41d1-41ef-8b6c-8aacaea8e58a"). 
InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:29:59 crc kubenswrapper[4906]: I0310 00:29:59.752904 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/659b6729-41d1-41ef-8b6c-8aacaea8e58a-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "659b6729-41d1-41ef-8b6c-8aacaea8e58a" (UID: "659b6729-41d1-41ef-8b6c-8aacaea8e58a"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:29:59 crc kubenswrapper[4906]: I0310 00:29:59.752980 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/659b6729-41d1-41ef-8b6c-8aacaea8e58a-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "659b6729-41d1-41ef-8b6c-8aacaea8e58a" (UID: "659b6729-41d1-41ef-8b6c-8aacaea8e58a"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:29:59 crc kubenswrapper[4906]: I0310 00:29:59.753187 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/659b6729-41d1-41ef-8b6c-8aacaea8e58a-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "659b6729-41d1-41ef-8b6c-8aacaea8e58a" (UID: "659b6729-41d1-41ef-8b6c-8aacaea8e58a"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:29:59 crc kubenswrapper[4906]: I0310 00:29:59.754183 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/659b6729-41d1-41ef-8b6c-8aacaea8e58a-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "659b6729-41d1-41ef-8b6c-8aacaea8e58a" (UID: "659b6729-41d1-41ef-8b6c-8aacaea8e58a"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:29:59 crc kubenswrapper[4906]: I0310 00:29:59.755838 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/659b6729-41d1-41ef-8b6c-8aacaea8e58a-builder-dockercfg-7plhd-pull" (OuterVolumeSpecName: "builder-dockercfg-7plhd-pull") pod "659b6729-41d1-41ef-8b6c-8aacaea8e58a" (UID: "659b6729-41d1-41ef-8b6c-8aacaea8e58a"). InnerVolumeSpecName "builder-dockercfg-7plhd-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:29:59 crc kubenswrapper[4906]: I0310 00:29:59.756504 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/659b6729-41d1-41ef-8b6c-8aacaea8e58a-builder-dockercfg-7plhd-push" (OuterVolumeSpecName: "builder-dockercfg-7plhd-push") pod "659b6729-41d1-41ef-8b6c-8aacaea8e58a" (UID: "659b6729-41d1-41ef-8b6c-8aacaea8e58a"). InnerVolumeSpecName "builder-dockercfg-7plhd-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:29:59 crc kubenswrapper[4906]: I0310 00:29:59.757503 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/659b6729-41d1-41ef-8b6c-8aacaea8e58a-kube-api-access-vfks7" (OuterVolumeSpecName: "kube-api-access-vfks7") pod "659b6729-41d1-41ef-8b6c-8aacaea8e58a" (UID: "659b6729-41d1-41ef-8b6c-8aacaea8e58a"). InnerVolumeSpecName "kube-api-access-vfks7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:29:59 crc kubenswrapper[4906]: I0310 00:29:59.851145 4906 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/659b6729-41d1-41ef-8b6c-8aacaea8e58a-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:59 crc kubenswrapper[4906]: I0310 00:29:59.851180 4906 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/659b6729-41d1-41ef-8b6c-8aacaea8e58a-builder-dockercfg-7plhd-pull\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:59 crc kubenswrapper[4906]: I0310 00:29:59.851194 4906 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/659b6729-41d1-41ef-8b6c-8aacaea8e58a-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:59 crc kubenswrapper[4906]: I0310 00:29:59.851207 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfks7\" (UniqueName: \"kubernetes.io/projected/659b6729-41d1-41ef-8b6c-8aacaea8e58a-kube-api-access-vfks7\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:59 crc kubenswrapper[4906]: I0310 00:29:59.851218 4906 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/659b6729-41d1-41ef-8b6c-8aacaea8e58a-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:59 crc kubenswrapper[4906]: I0310 00:29:59.851230 4906 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/659b6729-41d1-41ef-8b6c-8aacaea8e58a-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:59 crc kubenswrapper[4906]: I0310 00:29:59.851242 4906 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/659b6729-41d1-41ef-8b6c-8aacaea8e58a-container-storage-root\") on node \"crc\" 
DevicePath \"\"" Mar 10 00:29:59 crc kubenswrapper[4906]: I0310 00:29:59.851253 4906 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/659b6729-41d1-41ef-8b6c-8aacaea8e58a-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:59 crc kubenswrapper[4906]: I0310 00:29:59.851265 4906 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/659b6729-41d1-41ef-8b6c-8aacaea8e58a-builder-dockercfg-7plhd-push\") on node \"crc\" DevicePath \"\"" Mar 10 00:29:59 crc kubenswrapper[4906]: I0310 00:29:59.851275 4906 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/659b6729-41d1-41ef-8b6c-8aacaea8e58a-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:00 crc kubenswrapper[4906]: I0310 00:30:00.139251 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551710-sgxxm"] Mar 10 00:30:00 crc kubenswrapper[4906]: E0310 00:30:00.139563 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="659b6729-41d1-41ef-8b6c-8aacaea8e58a" containerName="docker-build" Mar 10 00:30:00 crc kubenswrapper[4906]: I0310 00:30:00.139583 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="659b6729-41d1-41ef-8b6c-8aacaea8e58a" containerName="docker-build" Mar 10 00:30:00 crc kubenswrapper[4906]: E0310 00:30:00.139618 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="659b6729-41d1-41ef-8b6c-8aacaea8e58a" containerName="manage-dockerfile" Mar 10 00:30:00 crc kubenswrapper[4906]: I0310 00:30:00.139628 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="659b6729-41d1-41ef-8b6c-8aacaea8e58a" containerName="manage-dockerfile" Mar 10 00:30:00 crc kubenswrapper[4906]: I0310 00:30:00.139804 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="659b6729-41d1-41ef-8b6c-8aacaea8e58a" 
containerName="docker-build" Mar 10 00:30:00 crc kubenswrapper[4906]: I0310 00:30:00.140291 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551710-sgxxm" Mar 10 00:30:00 crc kubenswrapper[4906]: I0310 00:30:00.142360 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ktdr6" Mar 10 00:30:00 crc kubenswrapper[4906]: I0310 00:30:00.143143 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 00:30:00 crc kubenswrapper[4906]: I0310 00:30:00.143735 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 00:30:00 crc kubenswrapper[4906]: I0310 00:30:00.150817 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551710-sgxxm"] Mar 10 00:30:00 crc kubenswrapper[4906]: I0310 00:30:00.240137 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551710-hzjvc"] Mar 10 00:30:00 crc kubenswrapper[4906]: I0310 00:30:00.242229 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551710-hzjvc" Mar 10 00:30:00 crc kubenswrapper[4906]: I0310 00:30:00.244680 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 10 00:30:00 crc kubenswrapper[4906]: I0310 00:30:00.244871 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 10 00:30:00 crc kubenswrapper[4906]: I0310 00:30:00.249759 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551710-hzjvc"] Mar 10 00:30:00 crc kubenswrapper[4906]: I0310 00:30:00.256446 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6p7k\" (UniqueName: \"kubernetes.io/projected/b331f320-249a-4c86-b3ec-89d0bf6ab0d0-kube-api-access-n6p7k\") pod \"auto-csr-approver-29551710-sgxxm\" (UID: \"b331f320-249a-4c86-b3ec-89d0bf6ab0d0\") " pod="openshift-infra/auto-csr-approver-29551710-sgxxm" Mar 10 00:30:00 crc kubenswrapper[4906]: I0310 00:30:00.358559 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9d901277-6aca-4488-bd32-21d77c19c467-secret-volume\") pod \"collect-profiles-29551710-hzjvc\" (UID: \"9d901277-6aca-4488-bd32-21d77c19c467\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551710-hzjvc" Mar 10 00:30:00 crc kubenswrapper[4906]: I0310 00:30:00.358840 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6p7k\" (UniqueName: \"kubernetes.io/projected/b331f320-249a-4c86-b3ec-89d0bf6ab0d0-kube-api-access-n6p7k\") pod \"auto-csr-approver-29551710-sgxxm\" (UID: \"b331f320-249a-4c86-b3ec-89d0bf6ab0d0\") " pod="openshift-infra/auto-csr-approver-29551710-sgxxm" Mar 10 
00:30:00 crc kubenswrapper[4906]: I0310 00:30:00.358970 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9d901277-6aca-4488-bd32-21d77c19c467-config-volume\") pod \"collect-profiles-29551710-hzjvc\" (UID: \"9d901277-6aca-4488-bd32-21d77c19c467\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551710-hzjvc" Mar 10 00:30:00 crc kubenswrapper[4906]: I0310 00:30:00.359179 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxxk6\" (UniqueName: \"kubernetes.io/projected/9d901277-6aca-4488-bd32-21d77c19c467-kube-api-access-sxxk6\") pod \"collect-profiles-29551710-hzjvc\" (UID: \"9d901277-6aca-4488-bd32-21d77c19c467\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551710-hzjvc" Mar 10 00:30:00 crc kubenswrapper[4906]: I0310 00:30:00.377260 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6p7k\" (UniqueName: \"kubernetes.io/projected/b331f320-249a-4c86-b3ec-89d0bf6ab0d0-kube-api-access-n6p7k\") pod \"auto-csr-approver-29551710-sgxxm\" (UID: \"b331f320-249a-4c86-b3ec-89d0bf6ab0d0\") " pod="openshift-infra/auto-csr-approver-29551710-sgxxm" Mar 10 00:30:00 crc kubenswrapper[4906]: I0310 00:30:00.433873 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-1-build_659b6729-41d1-41ef-8b6c-8aacaea8e58a/docker-build/0.log" Mar 10 00:30:00 crc kubenswrapper[4906]: I0310 00:30:00.434436 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"659b6729-41d1-41ef-8b6c-8aacaea8e58a","Type":"ContainerDied","Data":"1a51b6eaa5dbdfbdbd74072c4227a21d56f3cc8f0ef9cc18d94d51ef41e8bd21"} Mar 10 00:30:00 crc kubenswrapper[4906]: I0310 00:30:00.434568 4906 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="1a51b6eaa5dbdfbdbd74072c4227a21d56f3cc8f0ef9cc18d94d51ef41e8bd21" Mar 10 00:30:00 crc kubenswrapper[4906]: I0310 00:30:00.434735 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 10 00:30:00 crc kubenswrapper[4906]: I0310 00:30:00.460358 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9d901277-6aca-4488-bd32-21d77c19c467-config-volume\") pod \"collect-profiles-29551710-hzjvc\" (UID: \"9d901277-6aca-4488-bd32-21d77c19c467\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551710-hzjvc" Mar 10 00:30:00 crc kubenswrapper[4906]: I0310 00:30:00.460767 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551710-sgxxm" Mar 10 00:30:00 crc kubenswrapper[4906]: I0310 00:30:00.460991 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxxk6\" (UniqueName: \"kubernetes.io/projected/9d901277-6aca-4488-bd32-21d77c19c467-kube-api-access-sxxk6\") pod \"collect-profiles-29551710-hzjvc\" (UID: \"9d901277-6aca-4488-bd32-21d77c19c467\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551710-hzjvc" Mar 10 00:30:00 crc kubenswrapper[4906]: I0310 00:30:00.461697 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9d901277-6aca-4488-bd32-21d77c19c467-secret-volume\") pod \"collect-profiles-29551710-hzjvc\" (UID: \"9d901277-6aca-4488-bd32-21d77c19c467\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551710-hzjvc" Mar 10 00:30:00 crc kubenswrapper[4906]: I0310 00:30:00.461810 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/9d901277-6aca-4488-bd32-21d77c19c467-config-volume\") pod \"collect-profiles-29551710-hzjvc\" (UID: \"9d901277-6aca-4488-bd32-21d77c19c467\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551710-hzjvc" Mar 10 00:30:00 crc kubenswrapper[4906]: I0310 00:30:00.466025 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9d901277-6aca-4488-bd32-21d77c19c467-secret-volume\") pod \"collect-profiles-29551710-hzjvc\" (UID: \"9d901277-6aca-4488-bd32-21d77c19c467\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551710-hzjvc" Mar 10 00:30:00 crc kubenswrapper[4906]: I0310 00:30:00.485518 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxxk6\" (UniqueName: \"kubernetes.io/projected/9d901277-6aca-4488-bd32-21d77c19c467-kube-api-access-sxxk6\") pod \"collect-profiles-29551710-hzjvc\" (UID: \"9d901277-6aca-4488-bd32-21d77c19c467\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551710-hzjvc" Mar 10 00:30:00 crc kubenswrapper[4906]: I0310 00:30:00.557043 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551710-hzjvc" Mar 10 00:30:00 crc kubenswrapper[4906]: I0310 00:30:00.804066 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551710-hzjvc"] Mar 10 00:30:00 crc kubenswrapper[4906]: W0310 00:30:00.807723 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d901277_6aca_4488_bd32_21d77c19c467.slice/crio-82e0d1c649dfb7285a4626201319242842cc21992c406b78173cb3e65c3780b7 WatchSource:0}: Error finding container 82e0d1c649dfb7285a4626201319242842cc21992c406b78173cb3e65c3780b7: Status 404 returned error can't find the container with id 82e0d1c649dfb7285a4626201319242842cc21992c406b78173cb3e65c3780b7 Mar 10 00:30:00 crc kubenswrapper[4906]: W0310 00:30:00.958991 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb331f320_249a_4c86_b3ec_89d0bf6ab0d0.slice/crio-2b94b5a841892c3ddc6f85307b79eaebb50648d64ad9526a940d7ae80acdcb41 WatchSource:0}: Error finding container 2b94b5a841892c3ddc6f85307b79eaebb50648d64ad9526a940d7ae80acdcb41: Status 404 returned error can't find the container with id 2b94b5a841892c3ddc6f85307b79eaebb50648d64ad9526a940d7ae80acdcb41 Mar 10 00:30:00 crc kubenswrapper[4906]: I0310 00:30:00.955992 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551710-sgxxm"] Mar 10 00:30:01 crc kubenswrapper[4906]: I0310 00:30:01.441616 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551710-sgxxm" event={"ID":"b331f320-249a-4c86-b3ec-89d0bf6ab0d0","Type":"ContainerStarted","Data":"2b94b5a841892c3ddc6f85307b79eaebb50648d64ad9526a940d7ae80acdcb41"} Mar 10 00:30:01 crc kubenswrapper[4906]: I0310 00:30:01.443124 4906 generic.go:334] "Generic (PLEG): container finished" 
podID="9d901277-6aca-4488-bd32-21d77c19c467" containerID="c87cf5476079944250e254b82f5a35eb0067b9aa7435a88bbada4d2e8ff95121" exitCode=0 Mar 10 00:30:01 crc kubenswrapper[4906]: I0310 00:30:01.443156 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551710-hzjvc" event={"ID":"9d901277-6aca-4488-bd32-21d77c19c467","Type":"ContainerDied","Data":"c87cf5476079944250e254b82f5a35eb0067b9aa7435a88bbada4d2e8ff95121"} Mar 10 00:30:01 crc kubenswrapper[4906]: I0310 00:30:01.443180 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551710-hzjvc" event={"ID":"9d901277-6aca-4488-bd32-21d77c19c467","Type":"ContainerStarted","Data":"82e0d1c649dfb7285a4626201319242842cc21992c406b78173cb3e65c3780b7"} Mar 10 00:30:02 crc kubenswrapper[4906]: I0310 00:30:02.693426 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551710-hzjvc" Mar 10 00:30:02 crc kubenswrapper[4906]: I0310 00:30:02.827429 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9d901277-6aca-4488-bd32-21d77c19c467-secret-volume\") pod \"9d901277-6aca-4488-bd32-21d77c19c467\" (UID: \"9d901277-6aca-4488-bd32-21d77c19c467\") " Mar 10 00:30:02 crc kubenswrapper[4906]: I0310 00:30:02.827861 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxxk6\" (UniqueName: \"kubernetes.io/projected/9d901277-6aca-4488-bd32-21d77c19c467-kube-api-access-sxxk6\") pod \"9d901277-6aca-4488-bd32-21d77c19c467\" (UID: \"9d901277-6aca-4488-bd32-21d77c19c467\") " Mar 10 00:30:02 crc kubenswrapper[4906]: I0310 00:30:02.827936 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/9d901277-6aca-4488-bd32-21d77c19c467-config-volume\") pod \"9d901277-6aca-4488-bd32-21d77c19c467\" (UID: \"9d901277-6aca-4488-bd32-21d77c19c467\") " Mar 10 00:30:02 crc kubenswrapper[4906]: I0310 00:30:02.828828 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d901277-6aca-4488-bd32-21d77c19c467-config-volume" (OuterVolumeSpecName: "config-volume") pod "9d901277-6aca-4488-bd32-21d77c19c467" (UID: "9d901277-6aca-4488-bd32-21d77c19c467"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:30:02 crc kubenswrapper[4906]: I0310 00:30:02.834788 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d901277-6aca-4488-bd32-21d77c19c467-kube-api-access-sxxk6" (OuterVolumeSpecName: "kube-api-access-sxxk6") pod "9d901277-6aca-4488-bd32-21d77c19c467" (UID: "9d901277-6aca-4488-bd32-21d77c19c467"). InnerVolumeSpecName "kube-api-access-sxxk6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:30:02 crc kubenswrapper[4906]: I0310 00:30:02.834801 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d901277-6aca-4488-bd32-21d77c19c467-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9d901277-6aca-4488-bd32-21d77c19c467" (UID: "9d901277-6aca-4488-bd32-21d77c19c467"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:30:02 crc kubenswrapper[4906]: I0310 00:30:02.929180 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxxk6\" (UniqueName: \"kubernetes.io/projected/9d901277-6aca-4488-bd32-21d77c19c467-kube-api-access-sxxk6\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:02 crc kubenswrapper[4906]: I0310 00:30:02.929219 4906 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9d901277-6aca-4488-bd32-21d77c19c467-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:02 crc kubenswrapper[4906]: I0310 00:30:02.929232 4906 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9d901277-6aca-4488-bd32-21d77c19c467-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:03 crc kubenswrapper[4906]: I0310 00:30:03.458993 4906 generic.go:334] "Generic (PLEG): container finished" podID="b331f320-249a-4c86-b3ec-89d0bf6ab0d0" containerID="2c628760c387b1aafef06f0a246a2e5b5d795c623756dfa5334b3ca000207817" exitCode=0 Mar 10 00:30:03 crc kubenswrapper[4906]: I0310 00:30:03.459104 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551710-sgxxm" event={"ID":"b331f320-249a-4c86-b3ec-89d0bf6ab0d0","Type":"ContainerDied","Data":"2c628760c387b1aafef06f0a246a2e5b5d795c623756dfa5334b3ca000207817"} Mar 10 00:30:03 crc kubenswrapper[4906]: I0310 00:30:03.461050 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551710-hzjvc" event={"ID":"9d901277-6aca-4488-bd32-21d77c19c467","Type":"ContainerDied","Data":"82e0d1c649dfb7285a4626201319242842cc21992c406b78173cb3e65c3780b7"} Mar 10 00:30:03 crc kubenswrapper[4906]: I0310 00:30:03.461124 4906 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="82e0d1c649dfb7285a4626201319242842cc21992c406b78173cb3e65c3780b7" Mar 10 00:30:03 crc kubenswrapper[4906]: I0310 00:30:03.461136 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551710-hzjvc" Mar 10 00:30:03 crc kubenswrapper[4906]: I0310 00:30:03.947890 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wsbqv"] Mar 10 00:30:03 crc kubenswrapper[4906]: E0310 00:30:03.948198 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d901277-6aca-4488-bd32-21d77c19c467" containerName="collect-profiles" Mar 10 00:30:03 crc kubenswrapper[4906]: I0310 00:30:03.948213 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d901277-6aca-4488-bd32-21d77c19c467" containerName="collect-profiles" Mar 10 00:30:03 crc kubenswrapper[4906]: I0310 00:30:03.948355 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d901277-6aca-4488-bd32-21d77c19c467" containerName="collect-profiles" Mar 10 00:30:03 crc kubenswrapper[4906]: I0310 00:30:03.949365 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wsbqv" Mar 10 00:30:03 crc kubenswrapper[4906]: I0310 00:30:03.955293 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wsbqv"] Mar 10 00:30:04 crc kubenswrapper[4906]: I0310 00:30:04.143959 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52ab5448-cb0b-4c0b-955b-395f3eb3ca6d-catalog-content\") pod \"redhat-operators-wsbqv\" (UID: \"52ab5448-cb0b-4c0b-955b-395f3eb3ca6d\") " pod="openshift-marketplace/redhat-operators-wsbqv" Mar 10 00:30:04 crc kubenswrapper[4906]: I0310 00:30:04.144027 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6vzl\" (UniqueName: \"kubernetes.io/projected/52ab5448-cb0b-4c0b-955b-395f3eb3ca6d-kube-api-access-z6vzl\") pod \"redhat-operators-wsbqv\" (UID: \"52ab5448-cb0b-4c0b-955b-395f3eb3ca6d\") " pod="openshift-marketplace/redhat-operators-wsbqv" Mar 10 00:30:04 crc kubenswrapper[4906]: I0310 00:30:04.144059 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52ab5448-cb0b-4c0b-955b-395f3eb3ca6d-utilities\") pod \"redhat-operators-wsbqv\" (UID: \"52ab5448-cb0b-4c0b-955b-395f3eb3ca6d\") " pod="openshift-marketplace/redhat-operators-wsbqv" Mar 10 00:30:04 crc kubenswrapper[4906]: I0310 00:30:04.245232 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52ab5448-cb0b-4c0b-955b-395f3eb3ca6d-catalog-content\") pod \"redhat-operators-wsbqv\" (UID: \"52ab5448-cb0b-4c0b-955b-395f3eb3ca6d\") " pod="openshift-marketplace/redhat-operators-wsbqv" Mar 10 00:30:04 crc kubenswrapper[4906]: I0310 00:30:04.245303 4906 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-z6vzl\" (UniqueName: \"kubernetes.io/projected/52ab5448-cb0b-4c0b-955b-395f3eb3ca6d-kube-api-access-z6vzl\") pod \"redhat-operators-wsbqv\" (UID: \"52ab5448-cb0b-4c0b-955b-395f3eb3ca6d\") " pod="openshift-marketplace/redhat-operators-wsbqv" Mar 10 00:30:04 crc kubenswrapper[4906]: I0310 00:30:04.245329 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52ab5448-cb0b-4c0b-955b-395f3eb3ca6d-utilities\") pod \"redhat-operators-wsbqv\" (UID: \"52ab5448-cb0b-4c0b-955b-395f3eb3ca6d\") " pod="openshift-marketplace/redhat-operators-wsbqv" Mar 10 00:30:04 crc kubenswrapper[4906]: I0310 00:30:04.245790 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52ab5448-cb0b-4c0b-955b-395f3eb3ca6d-catalog-content\") pod \"redhat-operators-wsbqv\" (UID: \"52ab5448-cb0b-4c0b-955b-395f3eb3ca6d\") " pod="openshift-marketplace/redhat-operators-wsbqv" Mar 10 00:30:04 crc kubenswrapper[4906]: I0310 00:30:04.245850 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52ab5448-cb0b-4c0b-955b-395f3eb3ca6d-utilities\") pod \"redhat-operators-wsbqv\" (UID: \"52ab5448-cb0b-4c0b-955b-395f3eb3ca6d\") " pod="openshift-marketplace/redhat-operators-wsbqv" Mar 10 00:30:04 crc kubenswrapper[4906]: I0310 00:30:04.268955 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6vzl\" (UniqueName: \"kubernetes.io/projected/52ab5448-cb0b-4c0b-955b-395f3eb3ca6d-kube-api-access-z6vzl\") pod \"redhat-operators-wsbqv\" (UID: \"52ab5448-cb0b-4c0b-955b-395f3eb3ca6d\") " pod="openshift-marketplace/redhat-operators-wsbqv" Mar 10 00:30:04 crc kubenswrapper[4906]: I0310 00:30:04.568085 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wsbqv" Mar 10 00:30:04 crc kubenswrapper[4906]: I0310 00:30:04.729947 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551710-sgxxm" Mar 10 00:30:04 crc kubenswrapper[4906]: I0310 00:30:04.751582 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6p7k\" (UniqueName: \"kubernetes.io/projected/b331f320-249a-4c86-b3ec-89d0bf6ab0d0-kube-api-access-n6p7k\") pod \"b331f320-249a-4c86-b3ec-89d0bf6ab0d0\" (UID: \"b331f320-249a-4c86-b3ec-89d0bf6ab0d0\") " Mar 10 00:30:04 crc kubenswrapper[4906]: I0310 00:30:04.759867 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b331f320-249a-4c86-b3ec-89d0bf6ab0d0-kube-api-access-n6p7k" (OuterVolumeSpecName: "kube-api-access-n6p7k") pod "b331f320-249a-4c86-b3ec-89d0bf6ab0d0" (UID: "b331f320-249a-4c86-b3ec-89d0bf6ab0d0"). InnerVolumeSpecName "kube-api-access-n6p7k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:30:04 crc kubenswrapper[4906]: I0310 00:30:04.821876 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wsbqv"] Mar 10 00:30:04 crc kubenswrapper[4906]: W0310 00:30:04.828179 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52ab5448_cb0b_4c0b_955b_395f3eb3ca6d.slice/crio-d52796d019dbac7a8b4d7709a75ad7acafe8f11acf5abb10015a3f6e64cbd557 WatchSource:0}: Error finding container d52796d019dbac7a8b4d7709a75ad7acafe8f11acf5abb10015a3f6e64cbd557: Status 404 returned error can't find the container with id d52796d019dbac7a8b4d7709a75ad7acafe8f11acf5abb10015a3f6e64cbd557 Mar 10 00:30:04 crc kubenswrapper[4906]: I0310 00:30:04.856899 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6p7k\" (UniqueName: \"kubernetes.io/projected/b331f320-249a-4c86-b3ec-89d0bf6ab0d0-kube-api-access-n6p7k\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:05 crc kubenswrapper[4906]: I0310 00:30:05.473066 4906 generic.go:334] "Generic (PLEG): container finished" podID="52ab5448-cb0b-4c0b-955b-395f3eb3ca6d" containerID="43029a080da1b023ffd2e9d2a2c4908019f24172926bc6ddabc8c59c2f3948e3" exitCode=0 Mar 10 00:30:05 crc kubenswrapper[4906]: I0310 00:30:05.473396 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wsbqv" event={"ID":"52ab5448-cb0b-4c0b-955b-395f3eb3ca6d","Type":"ContainerDied","Data":"43029a080da1b023ffd2e9d2a2c4908019f24172926bc6ddabc8c59c2f3948e3"} Mar 10 00:30:05 crc kubenswrapper[4906]: I0310 00:30:05.473437 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wsbqv" event={"ID":"52ab5448-cb0b-4c0b-955b-395f3eb3ca6d","Type":"ContainerStarted","Data":"d52796d019dbac7a8b4d7709a75ad7acafe8f11acf5abb10015a3f6e64cbd557"} Mar 10 00:30:05 crc kubenswrapper[4906]: I0310 
00:30:05.474927 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551710-sgxxm" event={"ID":"b331f320-249a-4c86-b3ec-89d0bf6ab0d0","Type":"ContainerDied","Data":"2b94b5a841892c3ddc6f85307b79eaebb50648d64ad9526a940d7ae80acdcb41"} Mar 10 00:30:05 crc kubenswrapper[4906]: I0310 00:30:05.474965 4906 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b94b5a841892c3ddc6f85307b79eaebb50648d64ad9526a940d7ae80acdcb41" Mar 10 00:30:05 crc kubenswrapper[4906]: I0310 00:30:05.474984 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551710-sgxxm" Mar 10 00:30:05 crc kubenswrapper[4906]: I0310 00:30:05.513314 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Mar 10 00:30:05 crc kubenswrapper[4906]: I0310 00:30:05.520325 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Mar 10 00:30:05 crc kubenswrapper[4906]: I0310 00:30:05.796387 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551704-cctxp"] Mar 10 00:30:05 crc kubenswrapper[4906]: I0310 00:30:05.802592 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551704-cctxp"] Mar 10 00:30:06 crc kubenswrapper[4906]: I0310 00:30:06.589353 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d53a3cb-cdd6-40c8-9fc8-7b101762bac5" path="/var/lib/kubelet/pods/4d53a3cb-cdd6-40c8-9fc8-7b101762bac5/volumes" Mar 10 00:30:06 crc kubenswrapper[4906]: I0310 00:30:06.590897 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="659b6729-41d1-41ef-8b6c-8aacaea8e58a" path="/var/lib/kubelet/pods/659b6729-41d1-41ef-8b6c-8aacaea8e58a/volumes" Mar 10 00:30:07 crc kubenswrapper[4906]: I0310 00:30:07.288874 4906 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-2-build"] Mar 10 00:30:07 crc kubenswrapper[4906]: E0310 00:30:07.289517 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b331f320-249a-4c86-b3ec-89d0bf6ab0d0" containerName="oc" Mar 10 00:30:07 crc kubenswrapper[4906]: I0310 00:30:07.289533 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="b331f320-249a-4c86-b3ec-89d0bf6ab0d0" containerName="oc" Mar 10 00:30:07 crc kubenswrapper[4906]: I0310 00:30:07.289697 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="b331f320-249a-4c86-b3ec-89d0bf6ab0d0" containerName="oc" Mar 10 00:30:07 crc kubenswrapper[4906]: I0310 00:30:07.290492 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:07 crc kubenswrapper[4906]: I0310 00:30:07.293120 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-2-ca" Mar 10 00:30:07 crc kubenswrapper[4906]: I0310 00:30:07.293481 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-7plhd" Mar 10 00:30:07 crc kubenswrapper[4906]: I0310 00:30:07.294544 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-2-sys-config" Mar 10 00:30:07 crc kubenswrapper[4906]: I0310 00:30:07.297051 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-2-global-ca" Mar 10 00:30:07 crc kubenswrapper[4906]: I0310 00:30:07.335030 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-2-build"] Mar 10 00:30:07 crc kubenswrapper[4906]: I0310 00:30:07.491529 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-buildworkdir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:07 crc kubenswrapper[4906]: I0310 00:30:07.491589 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-container-storage-run\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:07 crc kubenswrapper[4906]: I0310 00:30:07.491622 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-builder-dockercfg-7plhd-pull\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:07 crc kubenswrapper[4906]: I0310 00:30:07.492043 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-container-storage-root\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:07 crc kubenswrapper[4906]: I0310 00:30:07.492182 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-build-blob-cache\") pod \"service-telemetry-operator-bundle-2-build\" (UID: 
\"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:07 crc kubenswrapper[4906]: I0310 00:30:07.492229 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-buildcachedir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:07 crc kubenswrapper[4906]: I0310 00:30:07.492262 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srflw\" (UniqueName: \"kubernetes.io/projected/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-kube-api-access-srflw\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:07 crc kubenswrapper[4906]: I0310 00:30:07.492358 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-builder-dockercfg-7plhd-push\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:07 crc kubenswrapper[4906]: I0310 00:30:07.492434 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-node-pullsecrets\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:07 crc kubenswrapper[4906]: I0310 00:30:07.492462 4906 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-build-system-configs\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:07 crc kubenswrapper[4906]: I0310 00:30:07.492506 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:07 crc kubenswrapper[4906]: I0310 00:30:07.492754 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-build-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:07 crc kubenswrapper[4906]: I0310 00:30:07.493786 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wsbqv" event={"ID":"52ab5448-cb0b-4c0b-955b-395f3eb3ca6d","Type":"ContainerStarted","Data":"de115c5c966a75ad111a49ec27935e17dd0b4838e1f3ca89b3987b0ced8c7c76"} Mar 10 00:30:07 crc kubenswrapper[4906]: I0310 00:30:07.594311 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-builder-dockercfg-7plhd-push\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\") " 
pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:07 crc kubenswrapper[4906]: I0310 00:30:07.594442 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-node-pullsecrets\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:07 crc kubenswrapper[4906]: I0310 00:30:07.594468 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-build-system-configs\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:07 crc kubenswrapper[4906]: I0310 00:30:07.594488 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:07 crc kubenswrapper[4906]: I0310 00:30:07.594524 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-build-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:07 crc kubenswrapper[4906]: I0310 00:30:07.594555 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: 
\"kubernetes.io/empty-dir/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-buildworkdir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:07 crc kubenswrapper[4906]: I0310 00:30:07.594589 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-container-storage-run\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:07 crc kubenswrapper[4906]: I0310 00:30:07.594616 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-builder-dockercfg-7plhd-pull\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:07 crc kubenswrapper[4906]: I0310 00:30:07.594676 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-container-storage-root\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:07 crc kubenswrapper[4906]: I0310 00:30:07.594715 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-build-blob-cache\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 
10 00:30:07 crc kubenswrapper[4906]: I0310 00:30:07.594738 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-buildcachedir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:07 crc kubenswrapper[4906]: I0310 00:30:07.594762 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srflw\" (UniqueName: \"kubernetes.io/projected/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-kube-api-access-srflw\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:07 crc kubenswrapper[4906]: I0310 00:30:07.596371 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-build-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:07 crc kubenswrapper[4906]: I0310 00:30:07.596698 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-buildworkdir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:07 crc kubenswrapper[4906]: I0310 00:30:07.597227 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-buildcachedir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: 
\"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:07 crc kubenswrapper[4906]: I0310 00:30:07.597355 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-node-pullsecrets\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:07 crc kubenswrapper[4906]: I0310 00:30:07.597491 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-build-blob-cache\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:07 crc kubenswrapper[4906]: I0310 00:30:07.597595 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:07 crc kubenswrapper[4906]: I0310 00:30:07.597751 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-container-storage-run\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:07 crc kubenswrapper[4906]: I0310 00:30:07.597756 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: 
\"kubernetes.io/empty-dir/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-container-storage-root\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:07 crc kubenswrapper[4906]: I0310 00:30:07.597776 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-build-system-configs\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:07 crc kubenswrapper[4906]: I0310 00:30:07.605214 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-builder-dockercfg-7plhd-pull\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:07 crc kubenswrapper[4906]: I0310 00:30:07.605273 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-builder-dockercfg-7plhd-push\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:07 crc kubenswrapper[4906]: I0310 00:30:07.615411 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srflw\" (UniqueName: \"kubernetes.io/projected/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-kube-api-access-srflw\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 
00:30:07 crc kubenswrapper[4906]: I0310 00:30:07.914297 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 10 00:30:08 crc kubenswrapper[4906]: I0310 00:30:08.132490 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-2-build"] Mar 10 00:30:08 crc kubenswrapper[4906]: I0310 00:30:08.501406 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2","Type":"ContainerStarted","Data":"de020838936637f799e44968797c8c205fc23e108f0737e994f6af61f7dd3f10"} Mar 10 00:30:08 crc kubenswrapper[4906]: I0310 00:30:08.501453 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2","Type":"ContainerStarted","Data":"0f406b91f6ea1cab91e60201090e2bc9d14d93e6f7da2874742ab1429ba0799c"} Mar 10 00:30:08 crc kubenswrapper[4906]: I0310 00:30:08.503054 4906 generic.go:334] "Generic (PLEG): container finished" podID="52ab5448-cb0b-4c0b-955b-395f3eb3ca6d" containerID="de115c5c966a75ad111a49ec27935e17dd0b4838e1f3ca89b3987b0ced8c7c76" exitCode=0 Mar 10 00:30:08 crc kubenswrapper[4906]: I0310 00:30:08.503091 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wsbqv" event={"ID":"52ab5448-cb0b-4c0b-955b-395f3eb3ca6d","Type":"ContainerDied","Data":"de115c5c966a75ad111a49ec27935e17dd0b4838e1f3ca89b3987b0ced8c7c76"} Mar 10 00:30:09 crc kubenswrapper[4906]: I0310 00:30:09.511608 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wsbqv" event={"ID":"52ab5448-cb0b-4c0b-955b-395f3eb3ca6d","Type":"ContainerStarted","Data":"5a261b191d74b856af707f017e3b21aef910e308389ca6201febb6ece342800d"} Mar 10 00:30:09 crc kubenswrapper[4906]: I0310 
00:30:09.513701 4906 generic.go:334] "Generic (PLEG): container finished" podID="0caa792c-fc50-4bd9-b76b-93c6d4ff33d2" containerID="de020838936637f799e44968797c8c205fc23e108f0737e994f6af61f7dd3f10" exitCode=0
Mar 10 00:30:09 crc kubenswrapper[4906]: I0310 00:30:09.513734 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2","Type":"ContainerDied","Data":"de020838936637f799e44968797c8c205fc23e108f0737e994f6af61f7dd3f10"}
Mar 10 00:30:09 crc kubenswrapper[4906]: I0310 00:30:09.604559 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wsbqv" podStartSLOduration=3.158183173 podStartE2EDuration="6.604541657s" podCreationTimestamp="2026-03-10 00:30:03 +0000 UTC" firstStartedPulling="2026-03-10 00:30:05.474788267 +0000 UTC m=+1431.622683379" lastFinishedPulling="2026-03-10 00:30:08.921146751 +0000 UTC m=+1435.069041863" observedRunningTime="2026-03-10 00:30:09.602250623 +0000 UTC m=+1435.750145735" watchObservedRunningTime="2026-03-10 00:30:09.604541657 +0000 UTC m=+1435.752436769"
Mar 10 00:30:10 crc kubenswrapper[4906]: I0310 00:30:10.523766 4906 generic.go:334] "Generic (PLEG): container finished" podID="0caa792c-fc50-4bd9-b76b-93c6d4ff33d2" containerID="22993634b00f53b8a87d1cc6f3e3348ecdbbdb045de45ba241d9932e76bf5200" exitCode=0
Mar 10 00:30:10 crc kubenswrapper[4906]: I0310 00:30:10.523944 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2","Type":"ContainerDied","Data":"22993634b00f53b8a87d1cc6f3e3348ecdbbdb045de45ba241d9932e76bf5200"}
Mar 10 00:30:10 crc kubenswrapper[4906]: I0310 00:30:10.575745 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-2-build_0caa792c-fc50-4bd9-b76b-93c6d4ff33d2/manage-dockerfile/0.log"
Mar 10 00:30:11 crc kubenswrapper[4906]: I0310 00:30:11.534277 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2","Type":"ContainerStarted","Data":"48dc7ad89441c62b5e8cbb45a9219fa7f4ebfc0e9fa3999d0d1bde84e0613287"}
Mar 10 00:30:11 crc kubenswrapper[4906]: I0310 00:30:11.578203 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-bundle-2-build" podStartSLOduration=4.578179072 podStartE2EDuration="4.578179072s" podCreationTimestamp="2026-03-10 00:30:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:30:11.56885411 +0000 UTC m=+1437.716749232" watchObservedRunningTime="2026-03-10 00:30:11.578179072 +0000 UTC m=+1437.726074194"
Mar 10 00:30:14 crc kubenswrapper[4906]: I0310 00:30:14.569374 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wsbqv"
Mar 10 00:30:14 crc kubenswrapper[4906]: I0310 00:30:14.569782 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wsbqv"
Mar 10 00:30:15 crc kubenswrapper[4906]: I0310 00:30:15.625915 4906 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wsbqv" podUID="52ab5448-cb0b-4c0b-955b-395f3eb3ca6d" containerName="registry-server" probeResult="failure" output=<
Mar 10 00:30:15 crc kubenswrapper[4906]: timeout: failed to connect service ":50051" within 1s
Mar 10 00:30:15 crc kubenswrapper[4906]: >
Mar 10 00:30:16 crc kubenswrapper[4906]: I0310 00:30:16.570697 4906 generic.go:334] "Generic (PLEG): container finished" podID="0caa792c-fc50-4bd9-b76b-93c6d4ff33d2" containerID="48dc7ad89441c62b5e8cbb45a9219fa7f4ebfc0e9fa3999d0d1bde84e0613287" exitCode=0
Mar 10 00:30:16 crc kubenswrapper[4906]: I0310 00:30:16.570749 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2","Type":"ContainerDied","Data":"48dc7ad89441c62b5e8cbb45a9219fa7f4ebfc0e9fa3999d0d1bde84e0613287"}
Mar 10 00:30:17 crc kubenswrapper[4906]: I0310 00:30:17.897042 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 10 00:30:18 crc kubenswrapper[4906]: I0310 00:30:18.030258 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-node-pullsecrets\") pod \"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\" (UID: \"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\") "
Mar 10 00:30:18 crc kubenswrapper[4906]: I0310 00:30:18.030329 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-builder-dockercfg-7plhd-pull\") pod \"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\" (UID: \"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\") "
Mar 10 00:30:18 crc kubenswrapper[4906]: I0310 00:30:18.030362 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-build-system-configs\") pod \"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\" (UID: \"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\") "
Mar 10 00:30:18 crc kubenswrapper[4906]: I0310 00:30:18.030371 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "0caa792c-fc50-4bd9-b76b-93c6d4ff33d2" (UID: "0caa792c-fc50-4bd9-b76b-93c6d4ff33d2"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 00:30:18 crc kubenswrapper[4906]: I0310 00:30:18.030402 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-build-ca-bundles\") pod \"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\" (UID: \"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\") "
Mar 10 00:30:18 crc kubenswrapper[4906]: I0310 00:30:18.030515 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srflw\" (UniqueName: \"kubernetes.io/projected/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-kube-api-access-srflw\") pod \"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\" (UID: \"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\") "
Mar 10 00:30:18 crc kubenswrapper[4906]: I0310 00:30:18.030567 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-build-proxy-ca-bundles\") pod \"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\" (UID: \"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\") "
Mar 10 00:30:18 crc kubenswrapper[4906]: I0310 00:30:18.030606 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-builder-dockercfg-7plhd-push\") pod \"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\" (UID: \"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\") "
Mar 10 00:30:18 crc kubenswrapper[4906]: I0310 00:30:18.030685 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-container-storage-root\") pod \"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\" (UID: \"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\") "
Mar 10 00:30:18 crc kubenswrapper[4906]: I0310 00:30:18.030723 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-buildcachedir\") pod \"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\" (UID: \"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\") "
Mar 10 00:30:18 crc kubenswrapper[4906]: I0310 00:30:18.030804 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-build-blob-cache\") pod \"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\" (UID: \"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\") "
Mar 10 00:30:18 crc kubenswrapper[4906]: I0310 00:30:18.030841 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-buildworkdir\") pod \"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\" (UID: \"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\") "
Mar 10 00:30:18 crc kubenswrapper[4906]: I0310 00:30:18.030892 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-container-storage-run\") pod \"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\" (UID: \"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2\") "
Mar 10 00:30:18 crc kubenswrapper[4906]: I0310 00:30:18.031257 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "0caa792c-fc50-4bd9-b76b-93c6d4ff33d2" (UID: "0caa792c-fc50-4bd9-b76b-93c6d4ff33d2"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:30:18 crc kubenswrapper[4906]: I0310 00:30:18.031376 4906 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-node-pullsecrets\") on node \"crc\" DevicePath \"\""
Mar 10 00:30:18 crc kubenswrapper[4906]: I0310 00:30:18.031392 4906 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-build-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 10 00:30:18 crc kubenswrapper[4906]: I0310 00:30:18.032109 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "0caa792c-fc50-4bd9-b76b-93c6d4ff33d2" (UID: "0caa792c-fc50-4bd9-b76b-93c6d4ff33d2"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:30:18 crc kubenswrapper[4906]: I0310 00:30:18.032543 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "0caa792c-fc50-4bd9-b76b-93c6d4ff33d2" (UID: "0caa792c-fc50-4bd9-b76b-93c6d4ff33d2"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:30:18 crc kubenswrapper[4906]: I0310 00:30:18.032598 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "0caa792c-fc50-4bd9-b76b-93c6d4ff33d2" (UID: "0caa792c-fc50-4bd9-b76b-93c6d4ff33d2"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 00:30:18 crc kubenswrapper[4906]: I0310 00:30:18.033003 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "0caa792c-fc50-4bd9-b76b-93c6d4ff33d2" (UID: "0caa792c-fc50-4bd9-b76b-93c6d4ff33d2"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 00:30:18 crc kubenswrapper[4906]: I0310 00:30:18.033497 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "0caa792c-fc50-4bd9-b76b-93c6d4ff33d2" (UID: "0caa792c-fc50-4bd9-b76b-93c6d4ff33d2"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 00:30:18 crc kubenswrapper[4906]: I0310 00:30:18.034724 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "0caa792c-fc50-4bd9-b76b-93c6d4ff33d2" (UID: "0caa792c-fc50-4bd9-b76b-93c6d4ff33d2"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 00:30:18 crc kubenswrapper[4906]: I0310 00:30:18.037301 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-builder-dockercfg-7plhd-pull" (OuterVolumeSpecName: "builder-dockercfg-7plhd-pull") pod "0caa792c-fc50-4bd9-b76b-93c6d4ff33d2" (UID: "0caa792c-fc50-4bd9-b76b-93c6d4ff33d2"). InnerVolumeSpecName "builder-dockercfg-7plhd-pull". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 00:30:18 crc kubenswrapper[4906]: I0310 00:30:18.037435 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-builder-dockercfg-7plhd-push" (OuterVolumeSpecName: "builder-dockercfg-7plhd-push") pod "0caa792c-fc50-4bd9-b76b-93c6d4ff33d2" (UID: "0caa792c-fc50-4bd9-b76b-93c6d4ff33d2"). InnerVolumeSpecName "builder-dockercfg-7plhd-push". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 00:30:18 crc kubenswrapper[4906]: I0310 00:30:18.038196 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-kube-api-access-srflw" (OuterVolumeSpecName: "kube-api-access-srflw") pod "0caa792c-fc50-4bd9-b76b-93c6d4ff33d2" (UID: "0caa792c-fc50-4bd9-b76b-93c6d4ff33d2"). InnerVolumeSpecName "kube-api-access-srflw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:30:18 crc kubenswrapper[4906]: I0310 00:30:18.039477 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "0caa792c-fc50-4bd9-b76b-93c6d4ff33d2" (UID: "0caa792c-fc50-4bd9-b76b-93c6d4ff33d2"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 00:30:18 crc kubenswrapper[4906]: I0310 00:30:18.132523 4906 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-build-system-configs\") on node \"crc\" DevicePath \"\""
Mar 10 00:30:18 crc kubenswrapper[4906]: I0310 00:30:18.132972 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srflw\" (UniqueName: \"kubernetes.io/projected/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-kube-api-access-srflw\") on node \"crc\" DevicePath \"\""
Mar 10 00:30:18 crc kubenswrapper[4906]: I0310 00:30:18.133130 4906 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 10 00:30:18 crc kubenswrapper[4906]: I0310 00:30:18.133275 4906 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-builder-dockercfg-7plhd-push\") on node \"crc\" DevicePath \"\""
Mar 10 00:30:18 crc kubenswrapper[4906]: I0310 00:30:18.133433 4906 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-container-storage-root\") on node \"crc\" DevicePath \"\""
Mar 10 00:30:18 crc kubenswrapper[4906]: I0310 00:30:18.133610 4906 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-buildcachedir\") on node \"crc\" DevicePath \"\""
Mar 10 00:30:18 crc kubenswrapper[4906]: I0310 00:30:18.133812 4906 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-build-blob-cache\") on node \"crc\" DevicePath \"\""
Mar 10 00:30:18 crc kubenswrapper[4906]: I0310 00:30:18.133942 4906 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-buildworkdir\") on node \"crc\" DevicePath \"\""
Mar 10 00:30:18 crc kubenswrapper[4906]: I0310 00:30:18.134054 4906 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-container-storage-run\") on node \"crc\" DevicePath \"\""
Mar 10 00:30:18 crc kubenswrapper[4906]: I0310 00:30:18.134176 4906 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/0caa792c-fc50-4bd9-b76b-93c6d4ff33d2-builder-dockercfg-7plhd-pull\") on node \"crc\" DevicePath \"\""
Mar 10 00:30:18 crc kubenswrapper[4906]: I0310 00:30:18.594469 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 10 00:30:18 crc kubenswrapper[4906]: I0310 00:30:18.594474 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"0caa792c-fc50-4bd9-b76b-93c6d4ff33d2","Type":"ContainerDied","Data":"0f406b91f6ea1cab91e60201090e2bc9d14d93e6f7da2874742ab1429ba0799c"}
Mar 10 00:30:18 crc kubenswrapper[4906]: I0310 00:30:18.595167 4906 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f406b91f6ea1cab91e60201090e2bc9d14d93e6f7da2874742ab1429ba0799c"
Mar 10 00:30:21 crc kubenswrapper[4906]: I0310 00:30:21.823183 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"]
Mar 10 00:30:21 crc kubenswrapper[4906]: E0310 00:30:21.823701 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0caa792c-fc50-4bd9-b76b-93c6d4ff33d2" containerName="docker-build"
Mar 10 00:30:21 crc kubenswrapper[4906]: I0310 00:30:21.823715 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="0caa792c-fc50-4bd9-b76b-93c6d4ff33d2" containerName="docker-build"
Mar 10 00:30:21 crc kubenswrapper[4906]: E0310 00:30:21.823730 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0caa792c-fc50-4bd9-b76b-93c6d4ff33d2" containerName="manage-dockerfile"
Mar 10 00:30:21 crc kubenswrapper[4906]: I0310 00:30:21.823738 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="0caa792c-fc50-4bd9-b76b-93c6d4ff33d2" containerName="manage-dockerfile"
Mar 10 00:30:21 crc kubenswrapper[4906]: E0310 00:30:21.823755 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0caa792c-fc50-4bd9-b76b-93c6d4ff33d2" containerName="git-clone"
Mar 10 00:30:21 crc kubenswrapper[4906]: I0310 00:30:21.823764 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="0caa792c-fc50-4bd9-b76b-93c6d4ff33d2" containerName="git-clone"
Mar 10 00:30:21 crc kubenswrapper[4906]: I0310 00:30:21.823904 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="0caa792c-fc50-4bd9-b76b-93c6d4ff33d2" containerName="docker-build"
Mar 10 00:30:21 crc kubenswrapper[4906]: I0310 00:30:21.824589 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 10 00:30:21 crc kubenswrapper[4906]: I0310 00:30:21.826127 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-1-sys-config"
Mar 10 00:30:21 crc kubenswrapper[4906]: I0310 00:30:21.827532 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-1-ca"
Mar 10 00:30:21 crc kubenswrapper[4906]: I0310 00:30:21.828882 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-7plhd"
Mar 10 00:30:21 crc kubenswrapper[4906]: I0310 00:30:21.829102 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-1-global-ca"
Mar 10 00:30:21 crc kubenswrapper[4906]: I0310 00:30:21.842744 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"]
Mar 10 00:30:21 crc kubenswrapper[4906]: I0310 00:30:21.990189 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/714acfb8-6728-4c52-9c24-fb13ddf3fa80-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 10 00:30:21 crc kubenswrapper[4906]: I0310 00:30:21.990534 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/714acfb8-6728-4c52-9c24-fb13ddf3fa80-builder-dockercfg-7plhd-pull\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 10 00:30:21 crc kubenswrapper[4906]: I0310 00:30:21.990698 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/714acfb8-6728-4c52-9c24-fb13ddf3fa80-build-system-configs\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 10 00:30:21 crc kubenswrapper[4906]: I0310 00:30:21.990828 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/714acfb8-6728-4c52-9c24-fb13ddf3fa80-node-pullsecrets\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 10 00:30:21 crc kubenswrapper[4906]: I0310 00:30:21.990953 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/714acfb8-6728-4c52-9c24-fb13ddf3fa80-buildcachedir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 10 00:30:21 crc kubenswrapper[4906]: I0310 00:30:21.991075 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/714acfb8-6728-4c52-9c24-fb13ddf3fa80-container-storage-run\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 10 00:30:21 crc kubenswrapper[4906]: I0310 00:30:21.991237 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/714acfb8-6728-4c52-9c24-fb13ddf3fa80-builder-dockercfg-7plhd-push\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 10 00:30:21 crc kubenswrapper[4906]: I0310 00:30:21.991376 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kw7c\" (UniqueName: \"kubernetes.io/projected/714acfb8-6728-4c52-9c24-fb13ddf3fa80-kube-api-access-6kw7c\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 10 00:30:21 crc kubenswrapper[4906]: I0310 00:30:21.991529 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/714acfb8-6728-4c52-9c24-fb13ddf3fa80-build-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 10 00:30:21 crc kubenswrapper[4906]: I0310 00:30:21.991729 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/714acfb8-6728-4c52-9c24-fb13ddf3fa80-buildworkdir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 10 00:30:21 crc kubenswrapper[4906]: I0310 00:30:21.991894 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/714acfb8-6728-4c52-9c24-fb13ddf3fa80-build-blob-cache\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 10 00:30:21 crc kubenswrapper[4906]: I0310 00:30:21.992058 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/714acfb8-6728-4c52-9c24-fb13ddf3fa80-container-storage-root\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 10 00:30:22 crc kubenswrapper[4906]: I0310 00:30:22.093529 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/714acfb8-6728-4c52-9c24-fb13ddf3fa80-builder-dockercfg-7plhd-pull\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 10 00:30:22 crc kubenswrapper[4906]: I0310 00:30:22.093614 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/714acfb8-6728-4c52-9c24-fb13ddf3fa80-build-system-configs\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 10 00:30:22 crc kubenswrapper[4906]: I0310 00:30:22.093706 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/714acfb8-6728-4c52-9c24-fb13ddf3fa80-node-pullsecrets\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 10 00:30:22 crc kubenswrapper[4906]: I0310 00:30:22.093752 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/714acfb8-6728-4c52-9c24-fb13ddf3fa80-buildcachedir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 10 00:30:22 crc kubenswrapper[4906]: I0310 00:30:22.093795 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/714acfb8-6728-4c52-9c24-fb13ddf3fa80-container-storage-run\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 10 00:30:22 crc kubenswrapper[4906]: I0310 00:30:22.093840 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/714acfb8-6728-4c52-9c24-fb13ddf3fa80-builder-dockercfg-7plhd-push\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 10 00:30:22 crc kubenswrapper[4906]: I0310 00:30:22.093870 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kw7c\" (UniqueName: \"kubernetes.io/projected/714acfb8-6728-4c52-9c24-fb13ddf3fa80-kube-api-access-6kw7c\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 10 00:30:22 crc kubenswrapper[4906]: I0310 00:30:22.093902 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/714acfb8-6728-4c52-9c24-fb13ddf3fa80-build-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 10 00:30:22 crc kubenswrapper[4906]: I0310 00:30:22.093957 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/714acfb8-6728-4c52-9c24-fb13ddf3fa80-buildworkdir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 10 00:30:22 crc kubenswrapper[4906]: I0310 00:30:22.094005 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/714acfb8-6728-4c52-9c24-fb13ddf3fa80-build-blob-cache\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 10 00:30:22 crc kubenswrapper[4906]: I0310 00:30:22.094049 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/714acfb8-6728-4c52-9c24-fb13ddf3fa80-container-storage-root\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 10 00:30:22 crc kubenswrapper[4906]: I0310 00:30:22.094118 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/714acfb8-6728-4c52-9c24-fb13ddf3fa80-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 10 00:30:22 crc kubenswrapper[4906]: I0310 00:30:22.094852 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/714acfb8-6728-4c52-9c24-fb13ddf3fa80-buildcachedir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 10 00:30:22 crc kubenswrapper[4906]: I0310 00:30:22.094893 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/714acfb8-6728-4c52-9c24-fb13ddf3fa80-node-pullsecrets\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 10 00:30:22 crc kubenswrapper[4906]: I0310 00:30:22.095438 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/714acfb8-6728-4c52-9c24-fb13ddf3fa80-build-blob-cache\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 10 00:30:22 crc kubenswrapper[4906]: I0310 00:30:22.095713 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/714acfb8-6728-4c52-9c24-fb13ddf3fa80-build-system-configs\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 10 00:30:22 crc kubenswrapper[4906]: I0310 00:30:22.095706 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/714acfb8-6728-4c52-9c24-fb13ddf3fa80-container-storage-root\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 10 00:30:22 crc kubenswrapper[4906]: I0310 00:30:22.095858 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/714acfb8-6728-4c52-9c24-fb13ddf3fa80-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 10 00:30:22 crc kubenswrapper[4906]: I0310 00:30:22.095887 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/714acfb8-6728-4c52-9c24-fb13ddf3fa80-container-storage-run\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 10 00:30:22 crc kubenswrapper[4906]: I0310 00:30:22.096285 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/714acfb8-6728-4c52-9c24-fb13ddf3fa80-buildworkdir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 10 00:30:22 crc kubenswrapper[4906]: I0310 00:30:22.096559 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/714acfb8-6728-4c52-9c24-fb13ddf3fa80-build-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 10 00:30:22 crc kubenswrapper[4906]: I0310 00:30:22.100677 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/714acfb8-6728-4c52-9c24-fb13ddf3fa80-builder-dockercfg-7plhd-pull\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 10 00:30:22 crc kubenswrapper[4906]: I0310 00:30:22.102408 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/714acfb8-6728-4c52-9c24-fb13ddf3fa80-builder-dockercfg-7plhd-push\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 10 00:30:22 crc kubenswrapper[4906]: I0310 00:30:22.110201 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kw7c\" (UniqueName: \"kubernetes.io/projected/714acfb8-6728-4c52-9c24-fb13ddf3fa80-kube-api-access-6kw7c\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 10 00:30:22 crc kubenswrapper[4906]: I0310 00:30:22.115980 4906 scope.go:117] "RemoveContainer" containerID="41937946e05a86d965f7df64848df0deadba1152c147422834140e5f87cae877"
Mar 10 00:30:22 crc kubenswrapper[4906]: I0310 00:30:22.143014 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 10 00:30:22 crc kubenswrapper[4906]: I0310 00:30:22.650038 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"]
Mar 10 00:30:23 crc kubenswrapper[4906]: I0310 00:30:23.637302 4906 generic.go:334] "Generic (PLEG): container finished" podID="714acfb8-6728-4c52-9c24-fb13ddf3fa80" containerID="80f9436bc15a06cf11da9ef27ffcbf8edfb89546604e89737143d24c8f3d8f63" exitCode=0
Mar 10 00:30:23 crc kubenswrapper[4906]: I0310 00:30:23.637387 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"714acfb8-6728-4c52-9c24-fb13ddf3fa80","Type":"ContainerDied","Data":"80f9436bc15a06cf11da9ef27ffcbf8edfb89546604e89737143d24c8f3d8f63"}
Mar 10 00:30:23 crc kubenswrapper[4906]: I0310 00:30:23.637653 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"714acfb8-6728-4c52-9c24-fb13ddf3fa80","Type":"ContainerStarted","Data":"800547433bad351d5175a36911d59cdf133504a593734889dcba528ef198d62a"}
Mar 10 00:30:24 crc kubenswrapper[4906]: I0310 00:30:24.647028 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wsbqv"
Mar 10 00:30:24 crc kubenswrapper[4906]: I0310 00:30:24.648809 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-1-build_714acfb8-6728-4c52-9c24-fb13ddf3fa80/docker-build/0.log"
Mar 10 00:30:24 crc kubenswrapper[4906]: I0310 00:30:24.650463 4906 generic.go:334] "Generic (PLEG): container finished" podID="714acfb8-6728-4c52-9c24-fb13ddf3fa80" containerID="c2765d137969e172edca3da8241f8c85f1f41730aa9e2b06f5ebc6ab6d39c687" exitCode=1
Mar 10 00:30:24 crc kubenswrapper[4906]: I0310 00:30:24.650546 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"714acfb8-6728-4c52-9c24-fb13ddf3fa80","Type":"ContainerDied","Data":"c2765d137969e172edca3da8241f8c85f1f41730aa9e2b06f5ebc6ab6d39c687"}
Mar 10 00:30:24 crc kubenswrapper[4906]: I0310 00:30:24.704923 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wsbqv"
Mar 10 00:30:25 crc kubenswrapper[4906]: I0310 00:30:25.137125 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wsbqv"]
Mar 10 00:30:25 crc kubenswrapper[4906]: I0310 00:30:25.932358 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-1-build_714acfb8-6728-4c52-9c24-fb13ddf3fa80/docker-build/0.log"
Mar 10 00:30:25 crc kubenswrapper[4906]: I0310 00:30:25.933909 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:26 crc kubenswrapper[4906]: I0310 00:30:26.073934 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/714acfb8-6728-4c52-9c24-fb13ddf3fa80-container-storage-root\") pod \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\" (UID: \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\") " Mar 10 00:30:26 crc kubenswrapper[4906]: I0310 00:30:26.073978 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/714acfb8-6728-4c52-9c24-fb13ddf3fa80-container-storage-run\") pod \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\" (UID: \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\") " Mar 10 00:30:26 crc kubenswrapper[4906]: I0310 00:30:26.074026 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/714acfb8-6728-4c52-9c24-fb13ddf3fa80-buildcachedir\") pod \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\" (UID: \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\") " Mar 10 00:30:26 crc kubenswrapper[4906]: I0310 00:30:26.074058 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/714acfb8-6728-4c52-9c24-fb13ddf3fa80-build-system-configs\") pod \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\" (UID: \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\") " Mar 10 00:30:26 crc kubenswrapper[4906]: I0310 00:30:26.074077 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/714acfb8-6728-4c52-9c24-fb13ddf3fa80-node-pullsecrets\") pod \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\" (UID: \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\") " Mar 10 00:30:26 crc kubenswrapper[4906]: I0310 00:30:26.074120 4906 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/714acfb8-6728-4c52-9c24-fb13ddf3fa80-builder-dockercfg-7plhd-push\") pod \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\" (UID: \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\") " Mar 10 00:30:26 crc kubenswrapper[4906]: I0310 00:30:26.074148 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kw7c\" (UniqueName: \"kubernetes.io/projected/714acfb8-6728-4c52-9c24-fb13ddf3fa80-kube-api-access-6kw7c\") pod \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\" (UID: \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\") " Mar 10 00:30:26 crc kubenswrapper[4906]: I0310 00:30:26.074169 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/714acfb8-6728-4c52-9c24-fb13ddf3fa80-build-blob-cache\") pod \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\" (UID: \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\") " Mar 10 00:30:26 crc kubenswrapper[4906]: I0310 00:30:26.074246 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/714acfb8-6728-4c52-9c24-fb13ddf3fa80-buildworkdir\") pod \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\" (UID: \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\") " Mar 10 00:30:26 crc kubenswrapper[4906]: I0310 00:30:26.074271 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/714acfb8-6728-4c52-9c24-fb13ddf3fa80-build-proxy-ca-bundles\") pod \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\" (UID: \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\") " Mar 10 00:30:26 crc kubenswrapper[4906]: I0310 00:30:26.074292 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/714acfb8-6728-4c52-9c24-fb13ddf3fa80-build-ca-bundles\") pod 
\"714acfb8-6728-4c52-9c24-fb13ddf3fa80\" (UID: \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\") " Mar 10 00:30:26 crc kubenswrapper[4906]: I0310 00:30:26.074341 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/714acfb8-6728-4c52-9c24-fb13ddf3fa80-builder-dockercfg-7plhd-pull\") pod \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\" (UID: \"714acfb8-6728-4c52-9c24-fb13ddf3fa80\") " Mar 10 00:30:26 crc kubenswrapper[4906]: I0310 00:30:26.074285 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/714acfb8-6728-4c52-9c24-fb13ddf3fa80-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "714acfb8-6728-4c52-9c24-fb13ddf3fa80" (UID: "714acfb8-6728-4c52-9c24-fb13ddf3fa80"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:30:26 crc kubenswrapper[4906]: I0310 00:30:26.074310 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/714acfb8-6728-4c52-9c24-fb13ddf3fa80-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "714acfb8-6728-4c52-9c24-fb13ddf3fa80" (UID: "714acfb8-6728-4c52-9c24-fb13ddf3fa80"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:30:26 crc kubenswrapper[4906]: I0310 00:30:26.074923 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/714acfb8-6728-4c52-9c24-fb13ddf3fa80-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "714acfb8-6728-4c52-9c24-fb13ddf3fa80" (UID: "714acfb8-6728-4c52-9c24-fb13ddf3fa80"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:30:26 crc kubenswrapper[4906]: I0310 00:30:26.075163 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/714acfb8-6728-4c52-9c24-fb13ddf3fa80-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "714acfb8-6728-4c52-9c24-fb13ddf3fa80" (UID: "714acfb8-6728-4c52-9c24-fb13ddf3fa80"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:30:26 crc kubenswrapper[4906]: I0310 00:30:26.075444 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/714acfb8-6728-4c52-9c24-fb13ddf3fa80-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "714acfb8-6728-4c52-9c24-fb13ddf3fa80" (UID: "714acfb8-6728-4c52-9c24-fb13ddf3fa80"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:30:26 crc kubenswrapper[4906]: I0310 00:30:26.075505 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/714acfb8-6728-4c52-9c24-fb13ddf3fa80-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "714acfb8-6728-4c52-9c24-fb13ddf3fa80" (UID: "714acfb8-6728-4c52-9c24-fb13ddf3fa80"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:30:26 crc kubenswrapper[4906]: I0310 00:30:26.075559 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/714acfb8-6728-4c52-9c24-fb13ddf3fa80-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "714acfb8-6728-4c52-9c24-fb13ddf3fa80" (UID: "714acfb8-6728-4c52-9c24-fb13ddf3fa80"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:30:26 crc kubenswrapper[4906]: I0310 00:30:26.076297 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/714acfb8-6728-4c52-9c24-fb13ddf3fa80-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "714acfb8-6728-4c52-9c24-fb13ddf3fa80" (UID: "714acfb8-6728-4c52-9c24-fb13ddf3fa80"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:30:26 crc kubenswrapper[4906]: I0310 00:30:26.076331 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/714acfb8-6728-4c52-9c24-fb13ddf3fa80-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "714acfb8-6728-4c52-9c24-fb13ddf3fa80" (UID: "714acfb8-6728-4c52-9c24-fb13ddf3fa80"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:30:26 crc kubenswrapper[4906]: I0310 00:30:26.079480 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/714acfb8-6728-4c52-9c24-fb13ddf3fa80-builder-dockercfg-7plhd-pull" (OuterVolumeSpecName: "builder-dockercfg-7plhd-pull") pod "714acfb8-6728-4c52-9c24-fb13ddf3fa80" (UID: "714acfb8-6728-4c52-9c24-fb13ddf3fa80"). InnerVolumeSpecName "builder-dockercfg-7plhd-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:30:26 crc kubenswrapper[4906]: I0310 00:30:26.079788 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/714acfb8-6728-4c52-9c24-fb13ddf3fa80-builder-dockercfg-7plhd-push" (OuterVolumeSpecName: "builder-dockercfg-7plhd-push") pod "714acfb8-6728-4c52-9c24-fb13ddf3fa80" (UID: "714acfb8-6728-4c52-9c24-fb13ddf3fa80"). InnerVolumeSpecName "builder-dockercfg-7plhd-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:30:26 crc kubenswrapper[4906]: I0310 00:30:26.080085 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/714acfb8-6728-4c52-9c24-fb13ddf3fa80-kube-api-access-6kw7c" (OuterVolumeSpecName: "kube-api-access-6kw7c") pod "714acfb8-6728-4c52-9c24-fb13ddf3fa80" (UID: "714acfb8-6728-4c52-9c24-fb13ddf3fa80"). InnerVolumeSpecName "kube-api-access-6kw7c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:30:26 crc kubenswrapper[4906]: I0310 00:30:26.176036 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kw7c\" (UniqueName: \"kubernetes.io/projected/714acfb8-6728-4c52-9c24-fb13ddf3fa80-kube-api-access-6kw7c\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:26 crc kubenswrapper[4906]: I0310 00:30:26.176093 4906 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/714acfb8-6728-4c52-9c24-fb13ddf3fa80-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:26 crc kubenswrapper[4906]: I0310 00:30:26.176115 4906 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/714acfb8-6728-4c52-9c24-fb13ddf3fa80-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:26 crc kubenswrapper[4906]: I0310 00:30:26.176136 4906 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/714acfb8-6728-4c52-9c24-fb13ddf3fa80-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:26 crc kubenswrapper[4906]: I0310 00:30:26.176155 4906 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/714acfb8-6728-4c52-9c24-fb13ddf3fa80-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:26 crc kubenswrapper[4906]: I0310 00:30:26.176174 4906 reconciler_common.go:293] "Volume 
detached for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/714acfb8-6728-4c52-9c24-fb13ddf3fa80-builder-dockercfg-7plhd-pull\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:26 crc kubenswrapper[4906]: I0310 00:30:26.176193 4906 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/714acfb8-6728-4c52-9c24-fb13ddf3fa80-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:26 crc kubenswrapper[4906]: I0310 00:30:26.176212 4906 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/714acfb8-6728-4c52-9c24-fb13ddf3fa80-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:26 crc kubenswrapper[4906]: I0310 00:30:26.176230 4906 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/714acfb8-6728-4c52-9c24-fb13ddf3fa80-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:26 crc kubenswrapper[4906]: I0310 00:30:26.176248 4906 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/714acfb8-6728-4c52-9c24-fb13ddf3fa80-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:26 crc kubenswrapper[4906]: I0310 00:30:26.176271 4906 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/714acfb8-6728-4c52-9c24-fb13ddf3fa80-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:26 crc kubenswrapper[4906]: I0310 00:30:26.176292 4906 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/714acfb8-6728-4c52-9c24-fb13ddf3fa80-builder-dockercfg-7plhd-push\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:26 crc kubenswrapper[4906]: I0310 00:30:26.672398 4906 log.go:25] "Finished parsing log file" 
path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-1-build_714acfb8-6728-4c52-9c24-fb13ddf3fa80/docker-build/0.log" Mar 10 00:30:26 crc kubenswrapper[4906]: I0310 00:30:26.673841 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"714acfb8-6728-4c52-9c24-fb13ddf3fa80","Type":"ContainerDied","Data":"800547433bad351d5175a36911d59cdf133504a593734889dcba528ef198d62a"} Mar 10 00:30:26 crc kubenswrapper[4906]: I0310 00:30:26.673904 4906 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="800547433bad351d5175a36911d59cdf133504a593734889dcba528ef198d62a" Mar 10 00:30:26 crc kubenswrapper[4906]: I0310 00:30:26.673936 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 10 00:30:26 crc kubenswrapper[4906]: I0310 00:30:26.673945 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wsbqv" podUID="52ab5448-cb0b-4c0b-955b-395f3eb3ca6d" containerName="registry-server" containerID="cri-o://5a261b191d74b856af707f017e3b21aef910e308389ca6201febb6ece342800d" gracePeriod=2 Mar 10 00:30:27 crc kubenswrapper[4906]: I0310 00:30:27.065052 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wsbqv" Mar 10 00:30:27 crc kubenswrapper[4906]: I0310 00:30:27.190133 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52ab5448-cb0b-4c0b-955b-395f3eb3ca6d-catalog-content\") pod \"52ab5448-cb0b-4c0b-955b-395f3eb3ca6d\" (UID: \"52ab5448-cb0b-4c0b-955b-395f3eb3ca6d\") " Mar 10 00:30:27 crc kubenswrapper[4906]: I0310 00:30:27.190556 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6vzl\" (UniqueName: \"kubernetes.io/projected/52ab5448-cb0b-4c0b-955b-395f3eb3ca6d-kube-api-access-z6vzl\") pod \"52ab5448-cb0b-4c0b-955b-395f3eb3ca6d\" (UID: \"52ab5448-cb0b-4c0b-955b-395f3eb3ca6d\") " Mar 10 00:30:27 crc kubenswrapper[4906]: I0310 00:30:27.190666 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52ab5448-cb0b-4c0b-955b-395f3eb3ca6d-utilities\") pod \"52ab5448-cb0b-4c0b-955b-395f3eb3ca6d\" (UID: \"52ab5448-cb0b-4c0b-955b-395f3eb3ca6d\") " Mar 10 00:30:27 crc kubenswrapper[4906]: I0310 00:30:27.191564 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52ab5448-cb0b-4c0b-955b-395f3eb3ca6d-utilities" (OuterVolumeSpecName: "utilities") pod "52ab5448-cb0b-4c0b-955b-395f3eb3ca6d" (UID: "52ab5448-cb0b-4c0b-955b-395f3eb3ca6d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:30:27 crc kubenswrapper[4906]: I0310 00:30:27.198113 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52ab5448-cb0b-4c0b-955b-395f3eb3ca6d-kube-api-access-z6vzl" (OuterVolumeSpecName: "kube-api-access-z6vzl") pod "52ab5448-cb0b-4c0b-955b-395f3eb3ca6d" (UID: "52ab5448-cb0b-4c0b-955b-395f3eb3ca6d"). InnerVolumeSpecName "kube-api-access-z6vzl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:30:27 crc kubenswrapper[4906]: I0310 00:30:27.292615 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6vzl\" (UniqueName: \"kubernetes.io/projected/52ab5448-cb0b-4c0b-955b-395f3eb3ca6d-kube-api-access-z6vzl\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:27 crc kubenswrapper[4906]: I0310 00:30:27.292733 4906 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52ab5448-cb0b-4c0b-955b-395f3eb3ca6d-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:27 crc kubenswrapper[4906]: I0310 00:30:27.363073 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52ab5448-cb0b-4c0b-955b-395f3eb3ca6d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "52ab5448-cb0b-4c0b-955b-395f3eb3ca6d" (UID: "52ab5448-cb0b-4c0b-955b-395f3eb3ca6d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:30:27 crc kubenswrapper[4906]: I0310 00:30:27.394080 4906 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52ab5448-cb0b-4c0b-955b-395f3eb3ca6d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:27 crc kubenswrapper[4906]: I0310 00:30:27.685454 4906 generic.go:334] "Generic (PLEG): container finished" podID="52ab5448-cb0b-4c0b-955b-395f3eb3ca6d" containerID="5a261b191d74b856af707f017e3b21aef910e308389ca6201febb6ece342800d" exitCode=0 Mar 10 00:30:27 crc kubenswrapper[4906]: I0310 00:30:27.685514 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wsbqv" event={"ID":"52ab5448-cb0b-4c0b-955b-395f3eb3ca6d","Type":"ContainerDied","Data":"5a261b191d74b856af707f017e3b21aef910e308389ca6201febb6ece342800d"} Mar 10 00:30:27 crc kubenswrapper[4906]: I0310 00:30:27.685556 4906 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-wsbqv" event={"ID":"52ab5448-cb0b-4c0b-955b-395f3eb3ca6d","Type":"ContainerDied","Data":"d52796d019dbac7a8b4d7709a75ad7acafe8f11acf5abb10015a3f6e64cbd557"} Mar 10 00:30:27 crc kubenswrapper[4906]: I0310 00:30:27.685586 4906 scope.go:117] "RemoveContainer" containerID="5a261b191d74b856af707f017e3b21aef910e308389ca6201febb6ece342800d" Mar 10 00:30:27 crc kubenswrapper[4906]: I0310 00:30:27.685578 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wsbqv" Mar 10 00:30:27 crc kubenswrapper[4906]: I0310 00:30:27.713118 4906 scope.go:117] "RemoveContainer" containerID="de115c5c966a75ad111a49ec27935e17dd0b4838e1f3ca89b3987b0ced8c7c76" Mar 10 00:30:27 crc kubenswrapper[4906]: I0310 00:30:27.744979 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wsbqv"] Mar 10 00:30:27 crc kubenswrapper[4906]: I0310 00:30:27.774307 4906 scope.go:117] "RemoveContainer" containerID="43029a080da1b023ffd2e9d2a2c4908019f24172926bc6ddabc8c59c2f3948e3" Mar 10 00:30:27 crc kubenswrapper[4906]: I0310 00:30:27.775356 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wsbqv"] Mar 10 00:30:27 crc kubenswrapper[4906]: I0310 00:30:27.797158 4906 scope.go:117] "RemoveContainer" containerID="5a261b191d74b856af707f017e3b21aef910e308389ca6201febb6ece342800d" Mar 10 00:30:27 crc kubenswrapper[4906]: E0310 00:30:27.797673 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a261b191d74b856af707f017e3b21aef910e308389ca6201febb6ece342800d\": container with ID starting with 5a261b191d74b856af707f017e3b21aef910e308389ca6201febb6ece342800d not found: ID does not exist" containerID="5a261b191d74b856af707f017e3b21aef910e308389ca6201febb6ece342800d" Mar 10 00:30:27 crc kubenswrapper[4906]: I0310 00:30:27.797729 4906 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a261b191d74b856af707f017e3b21aef910e308389ca6201febb6ece342800d"} err="failed to get container status \"5a261b191d74b856af707f017e3b21aef910e308389ca6201febb6ece342800d\": rpc error: code = NotFound desc = could not find container \"5a261b191d74b856af707f017e3b21aef910e308389ca6201febb6ece342800d\": container with ID starting with 5a261b191d74b856af707f017e3b21aef910e308389ca6201febb6ece342800d not found: ID does not exist" Mar 10 00:30:27 crc kubenswrapper[4906]: I0310 00:30:27.797763 4906 scope.go:117] "RemoveContainer" containerID="de115c5c966a75ad111a49ec27935e17dd0b4838e1f3ca89b3987b0ced8c7c76" Mar 10 00:30:27 crc kubenswrapper[4906]: E0310 00:30:27.799694 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de115c5c966a75ad111a49ec27935e17dd0b4838e1f3ca89b3987b0ced8c7c76\": container with ID starting with de115c5c966a75ad111a49ec27935e17dd0b4838e1f3ca89b3987b0ced8c7c76 not found: ID does not exist" containerID="de115c5c966a75ad111a49ec27935e17dd0b4838e1f3ca89b3987b0ced8c7c76" Mar 10 00:30:27 crc kubenswrapper[4906]: I0310 00:30:27.799740 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de115c5c966a75ad111a49ec27935e17dd0b4838e1f3ca89b3987b0ced8c7c76"} err="failed to get container status \"de115c5c966a75ad111a49ec27935e17dd0b4838e1f3ca89b3987b0ced8c7c76\": rpc error: code = NotFound desc = could not find container \"de115c5c966a75ad111a49ec27935e17dd0b4838e1f3ca89b3987b0ced8c7c76\": container with ID starting with de115c5c966a75ad111a49ec27935e17dd0b4838e1f3ca89b3987b0ced8c7c76 not found: ID does not exist" Mar 10 00:30:27 crc kubenswrapper[4906]: I0310 00:30:27.799762 4906 scope.go:117] "RemoveContainer" containerID="43029a080da1b023ffd2e9d2a2c4908019f24172926bc6ddabc8c59c2f3948e3" Mar 10 00:30:27 crc kubenswrapper[4906]: E0310 
00:30:27.800166 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43029a080da1b023ffd2e9d2a2c4908019f24172926bc6ddabc8c59c2f3948e3\": container with ID starting with 43029a080da1b023ffd2e9d2a2c4908019f24172926bc6ddabc8c59c2f3948e3 not found: ID does not exist" containerID="43029a080da1b023ffd2e9d2a2c4908019f24172926bc6ddabc8c59c2f3948e3" Mar 10 00:30:27 crc kubenswrapper[4906]: I0310 00:30:27.800211 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43029a080da1b023ffd2e9d2a2c4908019f24172926bc6ddabc8c59c2f3948e3"} err="failed to get container status \"43029a080da1b023ffd2e9d2a2c4908019f24172926bc6ddabc8c59c2f3948e3\": rpc error: code = NotFound desc = could not find container \"43029a080da1b023ffd2e9d2a2c4908019f24172926bc6ddabc8c59c2f3948e3\": container with ID starting with 43029a080da1b023ffd2e9d2a2c4908019f24172926bc6ddabc8c59c2f3948e3 not found: ID does not exist" Mar 10 00:30:28 crc kubenswrapper[4906]: I0310 00:30:28.592866 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52ab5448-cb0b-4c0b-955b-395f3eb3ca6d" path="/var/lib/kubelet/pods/52ab5448-cb0b-4c0b-955b-395f3eb3ca6d/volumes" Mar 10 00:30:32 crc kubenswrapper[4906]: I0310 00:30:32.778812 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Mar 10 00:30:32 crc kubenswrapper[4906]: I0310 00:30:32.785903 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Mar 10 00:30:34 crc kubenswrapper[4906]: I0310 00:30:34.393290 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-2-build"] Mar 10 00:30:34 crc kubenswrapper[4906]: E0310 00:30:34.393999 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52ab5448-cb0b-4c0b-955b-395f3eb3ca6d" 
containerName="registry-server" Mar 10 00:30:34 crc kubenswrapper[4906]: I0310 00:30:34.394020 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="52ab5448-cb0b-4c0b-955b-395f3eb3ca6d" containerName="registry-server" Mar 10 00:30:34 crc kubenswrapper[4906]: E0310 00:30:34.394036 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52ab5448-cb0b-4c0b-955b-395f3eb3ca6d" containerName="extract-content" Mar 10 00:30:34 crc kubenswrapper[4906]: I0310 00:30:34.394049 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="52ab5448-cb0b-4c0b-955b-395f3eb3ca6d" containerName="extract-content" Mar 10 00:30:34 crc kubenswrapper[4906]: E0310 00:30:34.394085 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52ab5448-cb0b-4c0b-955b-395f3eb3ca6d" containerName="extract-utilities" Mar 10 00:30:34 crc kubenswrapper[4906]: I0310 00:30:34.394101 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="52ab5448-cb0b-4c0b-955b-395f3eb3ca6d" containerName="extract-utilities" Mar 10 00:30:34 crc kubenswrapper[4906]: E0310 00:30:34.394121 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="714acfb8-6728-4c52-9c24-fb13ddf3fa80" containerName="manage-dockerfile" Mar 10 00:30:34 crc kubenswrapper[4906]: I0310 00:30:34.394132 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="714acfb8-6728-4c52-9c24-fb13ddf3fa80" containerName="manage-dockerfile" Mar 10 00:30:34 crc kubenswrapper[4906]: E0310 00:30:34.394150 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="714acfb8-6728-4c52-9c24-fb13ddf3fa80" containerName="docker-build" Mar 10 00:30:34 crc kubenswrapper[4906]: I0310 00:30:34.394163 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="714acfb8-6728-4c52-9c24-fb13ddf3fa80" containerName="docker-build" Mar 10 00:30:34 crc kubenswrapper[4906]: I0310 00:30:34.394362 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="52ab5448-cb0b-4c0b-955b-395f3eb3ca6d" 
containerName="registry-server" Mar 10 00:30:34 crc kubenswrapper[4906]: I0310 00:30:34.394384 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="714acfb8-6728-4c52-9c24-fb13ddf3fa80" containerName="docker-build" Mar 10 00:30:34 crc kubenswrapper[4906]: I0310 00:30:34.395768 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:30:34 crc kubenswrapper[4906]: I0310 00:30:34.397973 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-2-sys-config" Mar 10 00:30:34 crc kubenswrapper[4906]: I0310 00:30:34.398166 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-7plhd" Mar 10 00:30:34 crc kubenswrapper[4906]: I0310 00:30:34.398107 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-2-global-ca" Mar 10 00:30:34 crc kubenswrapper[4906]: I0310 00:30:34.398668 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-2-ca" Mar 10 00:30:34 crc kubenswrapper[4906]: I0310 00:30:34.417771 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-2-build"] Mar 10 00:30:34 crc kubenswrapper[4906]: I0310 00:30:34.493781 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/a2212ea3-6e6f-4f45-b594-95865bbb939b-builder-dockercfg-7plhd-push\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"a2212ea3-6e6f-4f45-b594-95865bbb939b\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:30:34 crc kubenswrapper[4906]: I0310 00:30:34.493818 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a2212ea3-6e6f-4f45-b594-95865bbb939b-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"a2212ea3-6e6f-4f45-b594-95865bbb939b\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:30:34 crc kubenswrapper[4906]: I0310 00:30:34.493837 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a2212ea3-6e6f-4f45-b594-95865bbb939b-container-storage-run\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"a2212ea3-6e6f-4f45-b594-95865bbb939b\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:30:34 crc kubenswrapper[4906]: I0310 00:30:34.494003 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a2212ea3-6e6f-4f45-b594-95865bbb939b-container-storage-root\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"a2212ea3-6e6f-4f45-b594-95865bbb939b\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:30:34 crc kubenswrapper[4906]: I0310 00:30:34.494075 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a2212ea3-6e6f-4f45-b594-95865bbb939b-buildworkdir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"a2212ea3-6e6f-4f45-b594-95865bbb939b\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:30:34 crc kubenswrapper[4906]: I0310 00:30:34.494232 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a2212ea3-6e6f-4f45-b594-95865bbb939b-build-blob-cache\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"a2212ea3-6e6f-4f45-b594-95865bbb939b\") " 
pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:30:34 crc kubenswrapper[4906]: I0310 00:30:34.494294 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a2212ea3-6e6f-4f45-b594-95865bbb939b-buildcachedir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"a2212ea3-6e6f-4f45-b594-95865bbb939b\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:30:34 crc kubenswrapper[4906]: I0310 00:30:34.494324 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a2212ea3-6e6f-4f45-b594-95865bbb939b-node-pullsecrets\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"a2212ea3-6e6f-4f45-b594-95865bbb939b\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:30:34 crc kubenswrapper[4906]: I0310 00:30:34.494353 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcd2m\" (UniqueName: \"kubernetes.io/projected/a2212ea3-6e6f-4f45-b594-95865bbb939b-kube-api-access-wcd2m\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"a2212ea3-6e6f-4f45-b594-95865bbb939b\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:30:34 crc kubenswrapper[4906]: I0310 00:30:34.494385 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a2212ea3-6e6f-4f45-b594-95865bbb939b-build-system-configs\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"a2212ea3-6e6f-4f45-b594-95865bbb939b\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:30:34 crc kubenswrapper[4906]: I0310 00:30:34.494412 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/a2212ea3-6e6f-4f45-b594-95865bbb939b-builder-dockercfg-7plhd-pull\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"a2212ea3-6e6f-4f45-b594-95865bbb939b\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:30:34 crc kubenswrapper[4906]: I0310 00:30:34.494436 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a2212ea3-6e6f-4f45-b594-95865bbb939b-build-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"a2212ea3-6e6f-4f45-b594-95865bbb939b\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:30:34 crc kubenswrapper[4906]: I0310 00:30:34.591423 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="714acfb8-6728-4c52-9c24-fb13ddf3fa80" path="/var/lib/kubelet/pods/714acfb8-6728-4c52-9c24-fb13ddf3fa80/volumes" Mar 10 00:30:34 crc kubenswrapper[4906]: I0310 00:30:34.595164 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a2212ea3-6e6f-4f45-b594-95865bbb939b-node-pullsecrets\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"a2212ea3-6e6f-4f45-b594-95865bbb939b\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:30:34 crc kubenswrapper[4906]: I0310 00:30:34.595299 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcd2m\" (UniqueName: \"kubernetes.io/projected/a2212ea3-6e6f-4f45-b594-95865bbb939b-kube-api-access-wcd2m\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"a2212ea3-6e6f-4f45-b594-95865bbb939b\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:30:34 crc kubenswrapper[4906]: I0310 00:30:34.595400 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a2212ea3-6e6f-4f45-b594-95865bbb939b-build-system-configs\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"a2212ea3-6e6f-4f45-b594-95865bbb939b\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:30:34 crc kubenswrapper[4906]: I0310 00:30:34.595518 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/a2212ea3-6e6f-4f45-b594-95865bbb939b-builder-dockercfg-7plhd-pull\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"a2212ea3-6e6f-4f45-b594-95865bbb939b\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:30:34 crc kubenswrapper[4906]: I0310 00:30:34.595625 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a2212ea3-6e6f-4f45-b594-95865bbb939b-build-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"a2212ea3-6e6f-4f45-b594-95865bbb939b\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:30:34 crc kubenswrapper[4906]: I0310 00:30:34.595826 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a2212ea3-6e6f-4f45-b594-95865bbb939b-node-pullsecrets\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"a2212ea3-6e6f-4f45-b594-95865bbb939b\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:30:34 crc kubenswrapper[4906]: I0310 00:30:34.596029 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/a2212ea3-6e6f-4f45-b594-95865bbb939b-builder-dockercfg-7plhd-push\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"a2212ea3-6e6f-4f45-b594-95865bbb939b\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 
00:30:34 crc kubenswrapper[4906]: I0310 00:30:34.596102 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a2212ea3-6e6f-4f45-b594-95865bbb939b-build-system-configs\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"a2212ea3-6e6f-4f45-b594-95865bbb939b\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:30:34 crc kubenswrapper[4906]: I0310 00:30:34.596109 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a2212ea3-6e6f-4f45-b594-95865bbb939b-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"a2212ea3-6e6f-4f45-b594-95865bbb939b\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:30:34 crc kubenswrapper[4906]: I0310 00:30:34.596189 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a2212ea3-6e6f-4f45-b594-95865bbb939b-container-storage-run\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"a2212ea3-6e6f-4f45-b594-95865bbb939b\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:30:34 crc kubenswrapper[4906]: I0310 00:30:34.596278 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a2212ea3-6e6f-4f45-b594-95865bbb939b-container-storage-root\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"a2212ea3-6e6f-4f45-b594-95865bbb939b\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:30:34 crc kubenswrapper[4906]: I0310 00:30:34.596314 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a2212ea3-6e6f-4f45-b594-95865bbb939b-buildworkdir\") pod 
\"smart-gateway-operator-bundle-2-build\" (UID: \"a2212ea3-6e6f-4f45-b594-95865bbb939b\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:30:34 crc kubenswrapper[4906]: I0310 00:30:34.596457 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a2212ea3-6e6f-4f45-b594-95865bbb939b-build-blob-cache\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"a2212ea3-6e6f-4f45-b594-95865bbb939b\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:30:34 crc kubenswrapper[4906]: I0310 00:30:34.596506 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a2212ea3-6e6f-4f45-b594-95865bbb939b-buildcachedir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"a2212ea3-6e6f-4f45-b594-95865bbb939b\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:30:34 crc kubenswrapper[4906]: I0310 00:30:34.596579 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a2212ea3-6e6f-4f45-b594-95865bbb939b-buildcachedir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"a2212ea3-6e6f-4f45-b594-95865bbb939b\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:30:34 crc kubenswrapper[4906]: I0310 00:30:34.596660 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a2212ea3-6e6f-4f45-b594-95865bbb939b-build-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"a2212ea3-6e6f-4f45-b594-95865bbb939b\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:30:34 crc kubenswrapper[4906]: I0310 00:30:34.596872 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: 
\"kubernetes.io/empty-dir/a2212ea3-6e6f-4f45-b594-95865bbb939b-container-storage-run\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"a2212ea3-6e6f-4f45-b594-95865bbb939b\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:30:34 crc kubenswrapper[4906]: I0310 00:30:34.597022 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a2212ea3-6e6f-4f45-b594-95865bbb939b-container-storage-root\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"a2212ea3-6e6f-4f45-b594-95865bbb939b\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:30:34 crc kubenswrapper[4906]: I0310 00:30:34.597162 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a2212ea3-6e6f-4f45-b594-95865bbb939b-build-blob-cache\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"a2212ea3-6e6f-4f45-b594-95865bbb939b\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:30:34 crc kubenswrapper[4906]: I0310 00:30:34.597542 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a2212ea3-6e6f-4f45-b594-95865bbb939b-buildworkdir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"a2212ea3-6e6f-4f45-b594-95865bbb939b\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:30:34 crc kubenswrapper[4906]: I0310 00:30:34.597735 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a2212ea3-6e6f-4f45-b594-95865bbb939b-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"a2212ea3-6e6f-4f45-b594-95865bbb939b\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:30:34 crc kubenswrapper[4906]: I0310 00:30:34.601803 4906 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/a2212ea3-6e6f-4f45-b594-95865bbb939b-builder-dockercfg-7plhd-pull\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"a2212ea3-6e6f-4f45-b594-95865bbb939b\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:30:34 crc kubenswrapper[4906]: I0310 00:30:34.603256 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/a2212ea3-6e6f-4f45-b594-95865bbb939b-builder-dockercfg-7plhd-push\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"a2212ea3-6e6f-4f45-b594-95865bbb939b\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:30:34 crc kubenswrapper[4906]: I0310 00:30:34.634093 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcd2m\" (UniqueName: \"kubernetes.io/projected/a2212ea3-6e6f-4f45-b594-95865bbb939b-kube-api-access-wcd2m\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"a2212ea3-6e6f-4f45-b594-95865bbb939b\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:30:34 crc kubenswrapper[4906]: I0310 00:30:34.718978 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:30:34 crc kubenswrapper[4906]: I0310 00:30:34.974088 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-2-build"] Mar 10 00:30:35 crc kubenswrapper[4906]: I0310 00:30:35.764061 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"a2212ea3-6e6f-4f45-b594-95865bbb939b","Type":"ContainerStarted","Data":"fcc130d23c0f48e62a5cc6d7e7ee87d90496f14a3b1924c3d353f16663340c84"} Mar 10 00:30:35 crc kubenswrapper[4906]: I0310 00:30:35.764401 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"a2212ea3-6e6f-4f45-b594-95865bbb939b","Type":"ContainerStarted","Data":"3587bd1dab9e5ec780ec4e3356d26a6f65583b8d116cc504b3d521c0f193fd3c"} Mar 10 00:30:36 crc kubenswrapper[4906]: I0310 00:30:36.771545 4906 generic.go:334] "Generic (PLEG): container finished" podID="a2212ea3-6e6f-4f45-b594-95865bbb939b" containerID="fcc130d23c0f48e62a5cc6d7e7ee87d90496f14a3b1924c3d353f16663340c84" exitCode=0 Mar 10 00:30:36 crc kubenswrapper[4906]: I0310 00:30:36.771583 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"a2212ea3-6e6f-4f45-b594-95865bbb939b","Type":"ContainerDied","Data":"fcc130d23c0f48e62a5cc6d7e7ee87d90496f14a3b1924c3d353f16663340c84"} Mar 10 00:30:37 crc kubenswrapper[4906]: I0310 00:30:37.791372 4906 generic.go:334] "Generic (PLEG): container finished" podID="a2212ea3-6e6f-4f45-b594-95865bbb939b" containerID="7062cafcd77ae52b109cc915b995f59405741c253379c48acf500229242922ad" exitCode=0 Mar 10 00:30:37 crc kubenswrapper[4906]: I0310 00:30:37.791744 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" 
event={"ID":"a2212ea3-6e6f-4f45-b594-95865bbb939b","Type":"ContainerDied","Data":"7062cafcd77ae52b109cc915b995f59405741c253379c48acf500229242922ad"} Mar 10 00:30:37 crc kubenswrapper[4906]: I0310 00:30:37.841758 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-2-build_a2212ea3-6e6f-4f45-b594-95865bbb939b/manage-dockerfile/0.log" Mar 10 00:30:38 crc kubenswrapper[4906]: I0310 00:30:38.828515 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"a2212ea3-6e6f-4f45-b594-95865bbb939b","Type":"ContainerStarted","Data":"a85f188e852f209d29fa397480554ff51970466120510e8634adf01c0d28c6d5"} Mar 10 00:30:38 crc kubenswrapper[4906]: I0310 00:30:38.878147 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-bundle-2-build" podStartSLOduration=4.878122132 podStartE2EDuration="4.878122132s" podCreationTimestamp="2026-03-10 00:30:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:30:38.872039012 +0000 UTC m=+1465.019934164" watchObservedRunningTime="2026-03-10 00:30:38.878122132 +0000 UTC m=+1465.026017274" Mar 10 00:30:41 crc kubenswrapper[4906]: I0310 00:30:41.868126 4906 generic.go:334] "Generic (PLEG): container finished" podID="a2212ea3-6e6f-4f45-b594-95865bbb939b" containerID="a85f188e852f209d29fa397480554ff51970466120510e8634adf01c0d28c6d5" exitCode=0 Mar 10 00:30:41 crc kubenswrapper[4906]: I0310 00:30:41.868248 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"a2212ea3-6e6f-4f45-b594-95865bbb939b","Type":"ContainerDied","Data":"a85f188e852f209d29fa397480554ff51970466120510e8634adf01c0d28c6d5"} Mar 10 00:30:43 crc kubenswrapper[4906]: I0310 00:30:43.121016 4906 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 10 00:30:43 crc kubenswrapper[4906]: I0310 00:30:43.225162 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a2212ea3-6e6f-4f45-b594-95865bbb939b-build-proxy-ca-bundles\") pod \"a2212ea3-6e6f-4f45-b594-95865bbb939b\" (UID: \"a2212ea3-6e6f-4f45-b594-95865bbb939b\") " Mar 10 00:30:43 crc kubenswrapper[4906]: I0310 00:30:43.225228 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a2212ea3-6e6f-4f45-b594-95865bbb939b-buildworkdir\") pod \"a2212ea3-6e6f-4f45-b594-95865bbb939b\" (UID: \"a2212ea3-6e6f-4f45-b594-95865bbb939b\") " Mar 10 00:30:43 crc kubenswrapper[4906]: I0310 00:30:43.225264 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a2212ea3-6e6f-4f45-b594-95865bbb939b-build-blob-cache\") pod \"a2212ea3-6e6f-4f45-b594-95865bbb939b\" (UID: \"a2212ea3-6e6f-4f45-b594-95865bbb939b\") " Mar 10 00:30:43 crc kubenswrapper[4906]: I0310 00:30:43.225300 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a2212ea3-6e6f-4f45-b594-95865bbb939b-node-pullsecrets\") pod \"a2212ea3-6e6f-4f45-b594-95865bbb939b\" (UID: \"a2212ea3-6e6f-4f45-b594-95865bbb939b\") " Mar 10 00:30:43 crc kubenswrapper[4906]: I0310 00:30:43.225327 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a2212ea3-6e6f-4f45-b594-95865bbb939b-container-storage-run\") pod \"a2212ea3-6e6f-4f45-b594-95865bbb939b\" (UID: \"a2212ea3-6e6f-4f45-b594-95865bbb939b\") " Mar 10 00:30:43 crc kubenswrapper[4906]: I0310 00:30:43.225356 4906 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/a2212ea3-6e6f-4f45-b594-95865bbb939b-builder-dockercfg-7plhd-push\") pod \"a2212ea3-6e6f-4f45-b594-95865bbb939b\" (UID: \"a2212ea3-6e6f-4f45-b594-95865bbb939b\") " Mar 10 00:30:43 crc kubenswrapper[4906]: I0310 00:30:43.225380 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a2212ea3-6e6f-4f45-b594-95865bbb939b-build-ca-bundles\") pod \"a2212ea3-6e6f-4f45-b594-95865bbb939b\" (UID: \"a2212ea3-6e6f-4f45-b594-95865bbb939b\") " Mar 10 00:30:43 crc kubenswrapper[4906]: I0310 00:30:43.225405 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a2212ea3-6e6f-4f45-b594-95865bbb939b-container-storage-root\") pod \"a2212ea3-6e6f-4f45-b594-95865bbb939b\" (UID: \"a2212ea3-6e6f-4f45-b594-95865bbb939b\") " Mar 10 00:30:43 crc kubenswrapper[4906]: I0310 00:30:43.225409 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a2212ea3-6e6f-4f45-b594-95865bbb939b-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "a2212ea3-6e6f-4f45-b594-95865bbb939b" (UID: "a2212ea3-6e6f-4f45-b594-95865bbb939b"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:30:43 crc kubenswrapper[4906]: I0310 00:30:43.225443 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcd2m\" (UniqueName: \"kubernetes.io/projected/a2212ea3-6e6f-4f45-b594-95865bbb939b-kube-api-access-wcd2m\") pod \"a2212ea3-6e6f-4f45-b594-95865bbb939b\" (UID: \"a2212ea3-6e6f-4f45-b594-95865bbb939b\") " Mar 10 00:30:43 crc kubenswrapper[4906]: I0310 00:30:43.225495 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a2212ea3-6e6f-4f45-b594-95865bbb939b-buildcachedir\") pod \"a2212ea3-6e6f-4f45-b594-95865bbb939b\" (UID: \"a2212ea3-6e6f-4f45-b594-95865bbb939b\") " Mar 10 00:30:43 crc kubenswrapper[4906]: I0310 00:30:43.225519 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/a2212ea3-6e6f-4f45-b594-95865bbb939b-builder-dockercfg-7plhd-pull\") pod \"a2212ea3-6e6f-4f45-b594-95865bbb939b\" (UID: \"a2212ea3-6e6f-4f45-b594-95865bbb939b\") " Mar 10 00:30:43 crc kubenswrapper[4906]: I0310 00:30:43.225546 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a2212ea3-6e6f-4f45-b594-95865bbb939b-build-system-configs\") pod \"a2212ea3-6e6f-4f45-b594-95865bbb939b\" (UID: \"a2212ea3-6e6f-4f45-b594-95865bbb939b\") " Mar 10 00:30:43 crc kubenswrapper[4906]: I0310 00:30:43.225568 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a2212ea3-6e6f-4f45-b594-95865bbb939b-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "a2212ea3-6e6f-4f45-b594-95865bbb939b" (UID: "a2212ea3-6e6f-4f45-b594-95865bbb939b"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 00:30:43 crc kubenswrapper[4906]: I0310 00:30:43.225781 4906 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/a2212ea3-6e6f-4f45-b594-95865bbb939b-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:43 crc kubenswrapper[4906]: I0310 00:30:43.225793 4906 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a2212ea3-6e6f-4f45-b594-95865bbb939b-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:43 crc kubenswrapper[4906]: I0310 00:30:43.229373 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2212ea3-6e6f-4f45-b594-95865bbb939b-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "a2212ea3-6e6f-4f45-b594-95865bbb939b" (UID: "a2212ea3-6e6f-4f45-b594-95865bbb939b"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:30:43 crc kubenswrapper[4906]: I0310 00:30:43.229679 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2212ea3-6e6f-4f45-b594-95865bbb939b-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "a2212ea3-6e6f-4f45-b594-95865bbb939b" (UID: "a2212ea3-6e6f-4f45-b594-95865bbb939b"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:30:43 crc kubenswrapper[4906]: I0310 00:30:43.229798 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2212ea3-6e6f-4f45-b594-95865bbb939b-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "a2212ea3-6e6f-4f45-b594-95865bbb939b" (UID: "a2212ea3-6e6f-4f45-b594-95865bbb939b"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:30:43 crc kubenswrapper[4906]: I0310 00:30:43.229890 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2212ea3-6e6f-4f45-b594-95865bbb939b-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "a2212ea3-6e6f-4f45-b594-95865bbb939b" (UID: "a2212ea3-6e6f-4f45-b594-95865bbb939b"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:30:43 crc kubenswrapper[4906]: I0310 00:30:43.230004 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2212ea3-6e6f-4f45-b594-95865bbb939b-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "a2212ea3-6e6f-4f45-b594-95865bbb939b" (UID: "a2212ea3-6e6f-4f45-b594-95865bbb939b"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:30:43 crc kubenswrapper[4906]: I0310 00:30:43.230148 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2212ea3-6e6f-4f45-b594-95865bbb939b-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "a2212ea3-6e6f-4f45-b594-95865bbb939b" (UID: "a2212ea3-6e6f-4f45-b594-95865bbb939b"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:30:43 crc kubenswrapper[4906]: I0310 00:30:43.233252 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2212ea3-6e6f-4f45-b594-95865bbb939b-builder-dockercfg-7plhd-push" (OuterVolumeSpecName: "builder-dockercfg-7plhd-push") pod "a2212ea3-6e6f-4f45-b594-95865bbb939b" (UID: "a2212ea3-6e6f-4f45-b594-95865bbb939b"). InnerVolumeSpecName "builder-dockercfg-7plhd-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:30:43 crc kubenswrapper[4906]: I0310 00:30:43.233750 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2212ea3-6e6f-4f45-b594-95865bbb939b-builder-dockercfg-7plhd-pull" (OuterVolumeSpecName: "builder-dockercfg-7plhd-pull") pod "a2212ea3-6e6f-4f45-b594-95865bbb939b" (UID: "a2212ea3-6e6f-4f45-b594-95865bbb939b"). InnerVolumeSpecName "builder-dockercfg-7plhd-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:30:43 crc kubenswrapper[4906]: I0310 00:30:43.233803 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2212ea3-6e6f-4f45-b594-95865bbb939b-kube-api-access-wcd2m" (OuterVolumeSpecName: "kube-api-access-wcd2m") pod "a2212ea3-6e6f-4f45-b594-95865bbb939b" (UID: "a2212ea3-6e6f-4f45-b594-95865bbb939b"). InnerVolumeSpecName "kube-api-access-wcd2m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:30:43 crc kubenswrapper[4906]: I0310 00:30:43.235741 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2212ea3-6e6f-4f45-b594-95865bbb939b-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "a2212ea3-6e6f-4f45-b594-95865bbb939b" (UID: "a2212ea3-6e6f-4f45-b594-95865bbb939b"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:30:43 crc kubenswrapper[4906]: I0310 00:30:43.326383 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcd2m\" (UniqueName: \"kubernetes.io/projected/a2212ea3-6e6f-4f45-b594-95865bbb939b-kube-api-access-wcd2m\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:43 crc kubenswrapper[4906]: I0310 00:30:43.326422 4906 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/a2212ea3-6e6f-4f45-b594-95865bbb939b-builder-dockercfg-7plhd-pull\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:43 crc kubenswrapper[4906]: I0310 00:30:43.326436 4906 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/a2212ea3-6e6f-4f45-b594-95865bbb939b-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:43 crc kubenswrapper[4906]: I0310 00:30:43.326448 4906 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a2212ea3-6e6f-4f45-b594-95865bbb939b-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:43 crc kubenswrapper[4906]: I0310 00:30:43.326462 4906 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/a2212ea3-6e6f-4f45-b594-95865bbb939b-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:43 crc kubenswrapper[4906]: I0310 00:30:43.326474 4906 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/a2212ea3-6e6f-4f45-b594-95865bbb939b-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:43 crc kubenswrapper[4906]: I0310 00:30:43.326486 4906 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/a2212ea3-6e6f-4f45-b594-95865bbb939b-container-storage-run\") on node \"crc\" 
DevicePath \"\"" Mar 10 00:30:43 crc kubenswrapper[4906]: I0310 00:30:43.326498 4906 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/a2212ea3-6e6f-4f45-b594-95865bbb939b-builder-dockercfg-7plhd-push\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:43 crc kubenswrapper[4906]: I0310 00:30:43.326509 4906 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a2212ea3-6e6f-4f45-b594-95865bbb939b-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:43 crc kubenswrapper[4906]: I0310 00:30:43.326521 4906 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/a2212ea3-6e6f-4f45-b594-95865bbb939b-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 10 00:30:43 crc kubenswrapper[4906]: I0310 00:30:43.886293 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"a2212ea3-6e6f-4f45-b594-95865bbb939b","Type":"ContainerDied","Data":"3587bd1dab9e5ec780ec4e3356d26a6f65583b8d116cc504b3d521c0f193fd3c"} Mar 10 00:30:43 crc kubenswrapper[4906]: I0310 00:30:43.886350 4906 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3587bd1dab9e5ec780ec4e3356d26a6f65583b8d116cc504b3d521c0f193fd3c" Mar 10 00:30:43 crc kubenswrapper[4906]: I0310 00:30:43.886405 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 10 00:31:00 crc kubenswrapper[4906]: I0310 00:31:00.502965 4906 patch_prober.go:28] interesting pod/machine-config-daemon-bxtw4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 00:31:00 crc kubenswrapper[4906]: I0310 00:31:00.503690 4906 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 00:31:00 crc kubenswrapper[4906]: I0310 00:31:00.604242 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-framework-index-1-build"]
Mar 10 00:31:00 crc kubenswrapper[4906]: E0310 00:31:00.604608 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2212ea3-6e6f-4f45-b594-95865bbb939b" containerName="manage-dockerfile"
Mar 10 00:31:00 crc kubenswrapper[4906]: I0310 00:31:00.604632 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2212ea3-6e6f-4f45-b594-95865bbb939b" containerName="manage-dockerfile"
Mar 10 00:31:00 crc kubenswrapper[4906]: E0310 00:31:00.604701 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2212ea3-6e6f-4f45-b594-95865bbb939b" containerName="git-clone"
Mar 10 00:31:00 crc kubenswrapper[4906]: I0310 00:31:00.604714 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2212ea3-6e6f-4f45-b594-95865bbb939b" containerName="git-clone"
Mar 10 00:31:00 crc kubenswrapper[4906]: E0310 00:31:00.604729 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2212ea3-6e6f-4f45-b594-95865bbb939b" containerName="docker-build"
Mar 10 00:31:00 crc kubenswrapper[4906]: I0310 00:31:00.604740 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2212ea3-6e6f-4f45-b594-95865bbb939b" containerName="docker-build"
Mar 10 00:31:00 crc kubenswrapper[4906]: I0310 00:31:00.604914 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2212ea3-6e6f-4f45-b594-95865bbb939b" containerName="docker-build"
Mar 10 00:31:00 crc kubenswrapper[4906]: I0310 00:31:00.606306 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build"
Mar 10 00:31:00 crc kubenswrapper[4906]: I0310 00:31:00.609349 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"service-telemetry-framework-index-dockercfg"
Mar 10 00:31:00 crc kubenswrapper[4906]: I0310 00:31:00.610069 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-framework-index-1-sys-config"
Mar 10 00:31:00 crc kubenswrapper[4906]: I0310 00:31:00.610491 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-framework-index-1-ca"
Mar 10 00:31:00 crc kubenswrapper[4906]: I0310 00:31:00.610969 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-framework-index-1-global-ca"
Mar 10 00:31:00 crc kubenswrapper[4906]: I0310 00:31:00.611704 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-7plhd"
Mar 10 00:31:00 crc kubenswrapper[4906]: I0310 00:31:00.625294 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-index-1-build"]
Mar 10 00:31:00 crc kubenswrapper[4906]: I0310 00:31:00.681448 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume
\"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-1-build\" (UID: \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Mar 10 00:31:00 crc kubenswrapper[4906]: I0310 00:31:00.681502 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-build-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Mar 10 00:31:00 crc kubenswrapper[4906]: I0310 00:31:00.681536 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mp7lw\" (UniqueName: \"kubernetes.io/projected/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-kube-api-access-mp7lw\") pod \"service-telemetry-framework-index-1-build\" (UID: \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Mar 10 00:31:00 crc kubenswrapper[4906]: I0310 00:31:00.681626 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-buildcachedir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Mar 10 00:31:00 crc kubenswrapper[4906]: I0310 00:31:00.681675 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-build-blob-cache\") pod \"service-telemetry-framework-index-1-build\" (UID: \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Mar 10 00:31:00 crc kubenswrapper[4906]: I0310 00:31:00.681693 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-buildworkdir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Mar 10 00:31:00 crc kubenswrapper[4906]: I0310 00:31:00.681712 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-builder-dockercfg-7plhd-push\") pod \"service-telemetry-framework-index-1-build\" (UID: \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Mar 10 00:31:00 crc kubenswrapper[4906]: I0310 00:31:00.681735 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-container-storage-root\") pod \"service-telemetry-framework-index-1-build\" (UID: \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Mar 10 00:31:00 crc kubenswrapper[4906]: I0310 00:31:00.681860 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-node-pullsecrets\") pod \"service-telemetry-framework-index-1-build\" (UID: \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Mar 10 00:31:00 crc
kubenswrapper[4906]: I0310 00:31:00.681955 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-builder-dockercfg-7plhd-pull\") pod \"service-telemetry-framework-index-1-build\" (UID: \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Mar 10 00:31:00 crc kubenswrapper[4906]: I0310 00:31:00.682036 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-build-system-configs\") pod \"service-telemetry-framework-index-1-build\" (UID: \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Mar 10 00:31:00 crc kubenswrapper[4906]: I0310 00:31:00.682075 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Mar 10 00:31:00 crc kubenswrapper[4906]: I0310 00:31:00.682098 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-container-storage-run\") pod \"service-telemetry-framework-index-1-build\" (UID: \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Mar 10 00:31:00 crc kubenswrapper[4906]: I0310 00:31:00.783469 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-node-pullsecrets\") pod \"service-telemetry-framework-index-1-build\" (UID: \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Mar 10 00:31:00 crc kubenswrapper[4906]: I0310 00:31:00.783540 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-builder-dockercfg-7plhd-pull\") pod \"service-telemetry-framework-index-1-build\" (UID: \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Mar 10 00:31:00 crc kubenswrapper[4906]: I0310 00:31:00.783585 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-build-system-configs\") pod \"service-telemetry-framework-index-1-build\" (UID: \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Mar 10 00:31:00 crc kubenswrapper[4906]: I0310 00:31:00.783615 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Mar 10 00:31:00 crc kubenswrapper[4906]: I0310 00:31:00.783671 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-container-storage-run\") pod \"service-telemetry-framework-index-1-build\" (UID: \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\") "
pod="service-telemetry/service-telemetry-framework-index-1-build"
Mar 10 00:31:00 crc kubenswrapper[4906]: I0310 00:31:00.783740 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-1-build\" (UID: \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Mar 10 00:31:00 crc kubenswrapper[4906]: I0310 00:31:00.783767 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-build-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Mar 10 00:31:00 crc kubenswrapper[4906]: I0310 00:31:00.783796 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mp7lw\" (UniqueName: \"kubernetes.io/projected/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-kube-api-access-mp7lw\") pod \"service-telemetry-framework-index-1-build\" (UID: \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Mar 10 00:31:00 crc kubenswrapper[4906]: I0310 00:31:00.783829 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-buildcachedir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Mar 10 00:31:00 crc kubenswrapper[4906]: I0310 00:31:00.783851 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-build-blob-cache\") pod \"service-telemetry-framework-index-1-build\" (UID: \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Mar 10 00:31:00 crc kubenswrapper[4906]: I0310 00:31:00.783873 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-buildworkdir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Mar 10 00:31:00 crc kubenswrapper[4906]: I0310 00:31:00.783897 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-builder-dockercfg-7plhd-push\") pod \"service-telemetry-framework-index-1-build\" (UID: \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Mar 10 00:31:00 crc kubenswrapper[4906]: I0310 00:31:00.783919 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-container-storage-root\") pod \"service-telemetry-framework-index-1-build\" (UID: \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Mar 10 00:31:00 crc kubenswrapper[4906]: I0310 00:31:00.784339 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-buildcachedir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\") "
pod="service-telemetry/service-telemetry-framework-index-1-build"
Mar 10 00:31:00 crc kubenswrapper[4906]: I0310 00:31:00.784414 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-container-storage-root\") pod \"service-telemetry-framework-index-1-build\" (UID: \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Mar 10 00:31:00 crc kubenswrapper[4906]: I0310 00:31:00.784747 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-build-blob-cache\") pod \"service-telemetry-framework-index-1-build\" (UID: \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Mar 10 00:31:00 crc kubenswrapper[4906]: I0310 00:31:00.785015 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-buildworkdir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Mar 10 00:31:00 crc kubenswrapper[4906]: I0310 00:31:00.785492 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-build-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Mar 10 00:31:00 crc kubenswrapper[4906]: I0310 00:31:00.786130 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-build-system-configs\") pod \"service-telemetry-framework-index-1-build\" (UID: \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Mar 10 00:31:00 crc kubenswrapper[4906]: I0310 00:31:00.786184 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-node-pullsecrets\") pod \"service-telemetry-framework-index-1-build\" (UID: \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Mar 10 00:31:00 crc kubenswrapper[4906]: I0310 00:31:00.786360 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Mar 10 00:31:00 crc kubenswrapper[4906]: I0310 00:31:00.786649 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-container-storage-run\") pod \"service-telemetry-framework-index-1-build\" (UID: \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Mar 10 00:31:00 crc kubenswrapper[4906]: I0310 00:31:00.791707 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-1-build\" (UID: \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\") "
pod="service-telemetry/service-telemetry-framework-index-1-build"
Mar 10 00:31:00 crc kubenswrapper[4906]: I0310 00:31:00.791715 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-builder-dockercfg-7plhd-pull\") pod \"service-telemetry-framework-index-1-build\" (UID: \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Mar 10 00:31:00 crc kubenswrapper[4906]: I0310 00:31:00.799499 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-builder-dockercfg-7plhd-push\") pod \"service-telemetry-framework-index-1-build\" (UID: \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Mar 10 00:31:00 crc kubenswrapper[4906]: I0310 00:31:00.813111 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mp7lw\" (UniqueName: \"kubernetes.io/projected/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-kube-api-access-mp7lw\") pod \"service-telemetry-framework-index-1-build\" (UID: \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\") " pod="service-telemetry/service-telemetry-framework-index-1-build"
Mar 10 00:31:00 crc kubenswrapper[4906]: I0310 00:31:00.933941 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build"
Mar 10 00:31:01 crc kubenswrapper[4906]: I0310 00:31:01.157893 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-index-1-build"]
Mar 10 00:31:02 crc kubenswrapper[4906]: I0310 00:31:02.023016 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8","Type":"ContainerStarted","Data":"4a9424ce3d835ab529599bc7d09217ee5a12e46be47e9b36ec8295b7178571da"}
Mar 10 00:31:02 crc kubenswrapper[4906]: I0310 00:31:02.023344 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8","Type":"ContainerStarted","Data":"62617535657c590c9bbd565410cfca9c3566e3be40679024980a20d1b4ffdcbf"}
Mar 10 00:31:03 crc kubenswrapper[4906]: I0310 00:31:03.038315 4906 generic.go:334] "Generic (PLEG): container finished" podID="6f86320d-c5e5-4ca7-a8be-c8a2683e76c8" containerID="4a9424ce3d835ab529599bc7d09217ee5a12e46be47e9b36ec8295b7178571da" exitCode=0
Mar 10 00:31:03 crc kubenswrapper[4906]: I0310 00:31:03.038417 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8","Type":"ContainerDied","Data":"4a9424ce3d835ab529599bc7d09217ee5a12e46be47e9b36ec8295b7178571da"}
Mar 10 00:31:04 crc kubenswrapper[4906]: I0310 00:31:04.049170 4906 generic.go:334] "Generic (PLEG): container finished" podID="6f86320d-c5e5-4ca7-a8be-c8a2683e76c8" containerID="5740ad97741a3749d5a1f18fdc07faaac39b98f6bceea2fb885b483540157324" exitCode=0
Mar 10 00:31:04 crc kubenswrapper[4906]: I0310 00:31:04.049228 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build"
event={"ID":"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8","Type":"ContainerDied","Data":"5740ad97741a3749d5a1f18fdc07faaac39b98f6bceea2fb885b483540157324"}
Mar 10 00:31:04 crc kubenswrapper[4906]: I0310 00:31:04.082287 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-framework-index-1-build_6f86320d-c5e5-4ca7-a8be-c8a2683e76c8/manage-dockerfile/0.log"
Mar 10 00:31:05 crc kubenswrapper[4906]: I0310 00:31:05.061036 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8","Type":"ContainerStarted","Data":"57da001186ee2d2979d67e7bb2ae9acb4aa2df5e9c230ef287675fbb4bde3b16"}
Mar 10 00:31:05 crc kubenswrapper[4906]: I0310 00:31:05.101167 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-framework-index-1-build" podStartSLOduration=5.101145268 podStartE2EDuration="5.101145268s" podCreationTimestamp="2026-03-10 00:31:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:31:05.100628663 +0000 UTC m=+1491.248523795" watchObservedRunningTime="2026-03-10 00:31:05.101145268 +0000 UTC m=+1491.249040390"
Mar 10 00:31:30 crc kubenswrapper[4906]: I0310 00:31:30.502954 4906 patch_prober.go:28] interesting pod/machine-config-daemon-bxtw4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 00:31:30 crc kubenswrapper[4906]: I0310 00:31:30.503727 4906 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 00:31:34 crc kubenswrapper[4906]: I0310 00:31:34.408654 4906 generic.go:334] "Generic (PLEG): container finished" podID="6f86320d-c5e5-4ca7-a8be-c8a2683e76c8" containerID="57da001186ee2d2979d67e7bb2ae9acb4aa2df5e9c230ef287675fbb4bde3b16" exitCode=0
Mar 10 00:31:34 crc kubenswrapper[4906]: I0310 00:31:34.408942 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8","Type":"ContainerDied","Data":"57da001186ee2d2979d67e7bb2ae9acb4aa2df5e9c230ef287675fbb4bde3b16"}
Mar 10 00:31:35 crc kubenswrapper[4906]: I0310 00:31:35.676451 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build"
Mar 10 00:31:35 crc kubenswrapper[4906]: I0310 00:31:35.771174 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-node-pullsecrets\") pod \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\" (UID: \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\") "
Mar 10 00:31:35 crc kubenswrapper[4906]: I0310 00:31:35.771244 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-container-storage-root\") pod \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\" (UID: \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\") "
Mar 10 00:31:35 crc kubenswrapper[4906]: I0310 00:31:35.771270 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-build-blob-cache\") pod \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\" (UID: \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\") "
Mar 10 00:31:35 crc
kubenswrapper[4906]: I0310 00:31:35.771291 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-build-proxy-ca-bundles\") pod \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\" (UID: \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\") "
Mar 10 00:31:35 crc kubenswrapper[4906]: I0310 00:31:35.771314 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-buildcachedir\") pod \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\" (UID: \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\") "
Mar 10 00:31:35 crc kubenswrapper[4906]: I0310 00:31:35.771318 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "6f86320d-c5e5-4ca7-a8be-c8a2683e76c8" (UID: "6f86320d-c5e5-4ca7-a8be-c8a2683e76c8"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 00:31:35 crc kubenswrapper[4906]: I0310 00:31:35.771334 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-build-system-configs\") pod \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\" (UID: \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\") "
Mar 10 00:31:35 crc kubenswrapper[4906]: I0310 00:31:35.771434 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-builder-dockercfg-7plhd-push\") pod \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\" (UID: \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\") "
Mar 10 00:31:35 crc kubenswrapper[4906]: I0310 00:31:35.771494 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\" (UID: \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\") "
Mar 10 00:31:35 crc kubenswrapper[4906]: I0310 00:31:35.771525 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-container-storage-run\") pod \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\" (UID: \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\") "
Mar 10 00:31:35 crc kubenswrapper[4906]: I0310 00:31:35.771545 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mp7lw\" (UniqueName: \"kubernetes.io/projected/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-kube-api-access-mp7lw\") pod \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\" (UID: \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\") "
Mar 10 00:31:35
crc kubenswrapper[4906]: I0310 00:31:35.771570 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-build-ca-bundles\") pod \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\" (UID: \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\") "
Mar 10 00:31:35 crc kubenswrapper[4906]: I0310 00:31:35.771588 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-buildworkdir\") pod \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\" (UID: \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\") "
Mar 10 00:31:35 crc kubenswrapper[4906]: I0310 00:31:35.771608 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-builder-dockercfg-7plhd-pull\") pod \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\" (UID: \"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8\") "
Mar 10 00:31:35 crc kubenswrapper[4906]: I0310 00:31:35.771965 4906 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-node-pullsecrets\") on node \"crc\" DevicePath \"\""
Mar 10 00:31:35 crc kubenswrapper[4906]: I0310 00:31:35.772740 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "6f86320d-c5e5-4ca7-a8be-c8a2683e76c8" (UID: "6f86320d-c5e5-4ca7-a8be-c8a2683e76c8"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:31:35 crc kubenswrapper[4906]: I0310 00:31:35.772831 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "6f86320d-c5e5-4ca7-a8be-c8a2683e76c8" (UID: "6f86320d-c5e5-4ca7-a8be-c8a2683e76c8"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 00:31:35 crc kubenswrapper[4906]: I0310 00:31:35.773421 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "6f86320d-c5e5-4ca7-a8be-c8a2683e76c8" (UID: "6f86320d-c5e5-4ca7-a8be-c8a2683e76c8"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 00:31:35 crc kubenswrapper[4906]: I0310 00:31:35.773635 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "6f86320d-c5e5-4ca7-a8be-c8a2683e76c8" (UID: "6f86320d-c5e5-4ca7-a8be-c8a2683e76c8"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 00:31:35 crc kubenswrapper[4906]: I0310 00:31:35.774128 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "6f86320d-c5e5-4ca7-a8be-c8a2683e76c8" (UID: "6f86320d-c5e5-4ca7-a8be-c8a2683e76c8"). InnerVolumeSpecName "build-system-configs".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:31:35 crc kubenswrapper[4906]: I0310 00:31:35.774594 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "6f86320d-c5e5-4ca7-a8be-c8a2683e76c8" (UID: "6f86320d-c5e5-4ca7-a8be-c8a2683e76c8"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:31:35 crc kubenswrapper[4906]: I0310 00:31:35.778381 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-builder-dockercfg-7plhd-pull" (OuterVolumeSpecName: "builder-dockercfg-7plhd-pull") pod "6f86320d-c5e5-4ca7-a8be-c8a2683e76c8" (UID: "6f86320d-c5e5-4ca7-a8be-c8a2683e76c8"). InnerVolumeSpecName "builder-dockercfg-7plhd-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:31:35 crc kubenswrapper[4906]: I0310 00:31:35.779947 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-kube-api-access-mp7lw" (OuterVolumeSpecName: "kube-api-access-mp7lw") pod "6f86320d-c5e5-4ca7-a8be-c8a2683e76c8" (UID: "6f86320d-c5e5-4ca7-a8be-c8a2683e76c8"). InnerVolumeSpecName "kube-api-access-mp7lw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:31:35 crc kubenswrapper[4906]: I0310 00:31:35.782681 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-service-telemetry-framework-index-dockercfg-user-build-volume" (OuterVolumeSpecName: "service-telemetry-framework-index-dockercfg-user-build-volume") pod "6f86320d-c5e5-4ca7-a8be-c8a2683e76c8" (UID: "6f86320d-c5e5-4ca7-a8be-c8a2683e76c8"). InnerVolumeSpecName "service-telemetry-framework-index-dockercfg-user-build-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:31:35 crc kubenswrapper[4906]: I0310 00:31:35.792261 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-builder-dockercfg-7plhd-push" (OuterVolumeSpecName: "builder-dockercfg-7plhd-push") pod "6f86320d-c5e5-4ca7-a8be-c8a2683e76c8" (UID: "6f86320d-c5e5-4ca7-a8be-c8a2683e76c8"). InnerVolumeSpecName "builder-dockercfg-7plhd-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:31:35 crc kubenswrapper[4906]: I0310 00:31:35.872716 4906 reconciler_common.go:293] "Volume detached for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-service-telemetry-framework-index-dockercfg-user-build-volume\") on node \"crc\" DevicePath \"\"" Mar 10 00:31:35 crc kubenswrapper[4906]: I0310 00:31:35.872745 4906 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 10 00:31:35 crc kubenswrapper[4906]: I0310 00:31:35.872755 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mp7lw\" (UniqueName: \"kubernetes.io/projected/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-kube-api-access-mp7lw\") on node \"crc\" DevicePath \"\"" Mar 10 00:31:35 crc kubenswrapper[4906]: I0310 00:31:35.872765 4906 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:31:35 crc kubenswrapper[4906]: I0310 00:31:35.872774 4906 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 10 
00:31:35 crc kubenswrapper[4906]: I0310 00:31:35.872783 4906 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-7plhd-pull\" (UniqueName: \"kubernetes.io/secret/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-builder-dockercfg-7plhd-pull\") on node \"crc\" DevicePath \"\"" Mar 10 00:31:35 crc kubenswrapper[4906]: I0310 00:31:35.872794 4906 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 00:31:35 crc kubenswrapper[4906]: I0310 00:31:35.872803 4906 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 10 00:31:35 crc kubenswrapper[4906]: I0310 00:31:35.872811 4906 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 10 00:31:35 crc kubenswrapper[4906]: I0310 00:31:35.872820 4906 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-7plhd-push\" (UniqueName: \"kubernetes.io/secret/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-builder-dockercfg-7plhd-push\") on node \"crc\" DevicePath \"\"" Mar 10 00:31:35 crc kubenswrapper[4906]: I0310 00:31:35.999253 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "6f86320d-c5e5-4ca7-a8be-c8a2683e76c8" (UID: "6f86320d-c5e5-4ca7-a8be-c8a2683e76c8"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:31:36 crc kubenswrapper[4906]: I0310 00:31:36.076318 4906 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 10 00:31:36 crc kubenswrapper[4906]: I0310 00:31:36.428043 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"6f86320d-c5e5-4ca7-a8be-c8a2683e76c8","Type":"ContainerDied","Data":"62617535657c590c9bbd565410cfca9c3566e3be40679024980a20d1b4ffdcbf"} Mar 10 00:31:36 crc kubenswrapper[4906]: I0310 00:31:36.428119 4906 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62617535657c590c9bbd565410cfca9c3566e3be40679024980a20d1b4ffdcbf" Mar 10 00:31:36 crc kubenswrapper[4906]: I0310 00:31:36.428141 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 10 00:31:38 crc kubenswrapper[4906]: I0310 00:31:38.085978 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/infrawatch-operators-np26s"] Mar 10 00:31:38 crc kubenswrapper[4906]: E0310 00:31:38.086486 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f86320d-c5e5-4ca7-a8be-c8a2683e76c8" containerName="manage-dockerfile" Mar 10 00:31:38 crc kubenswrapper[4906]: I0310 00:31:38.086501 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f86320d-c5e5-4ca7-a8be-c8a2683e76c8" containerName="manage-dockerfile" Mar 10 00:31:38 crc kubenswrapper[4906]: E0310 00:31:38.086521 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f86320d-c5e5-4ca7-a8be-c8a2683e76c8" containerName="git-clone" Mar 10 00:31:38 crc kubenswrapper[4906]: I0310 00:31:38.086530 4906 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6f86320d-c5e5-4ca7-a8be-c8a2683e76c8" containerName="git-clone" Mar 10 00:31:38 crc kubenswrapper[4906]: E0310 00:31:38.086551 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f86320d-c5e5-4ca7-a8be-c8a2683e76c8" containerName="docker-build" Mar 10 00:31:38 crc kubenswrapper[4906]: I0310 00:31:38.086559 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f86320d-c5e5-4ca7-a8be-c8a2683e76c8" containerName="docker-build" Mar 10 00:31:38 crc kubenswrapper[4906]: I0310 00:31:38.086711 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f86320d-c5e5-4ca7-a8be-c8a2683e76c8" containerName="docker-build" Mar 10 00:31:38 crc kubenswrapper[4906]: I0310 00:31:38.087190 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-np26s" Mar 10 00:31:38 crc kubenswrapper[4906]: I0310 00:31:38.090751 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"infrawatch-operators-dockercfg-fxbbf" Mar 10 00:31:38 crc kubenswrapper[4906]: I0310 00:31:38.099115 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-np26s"] Mar 10 00:31:38 crc kubenswrapper[4906]: I0310 00:31:38.105264 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-889qk\" (UniqueName: \"kubernetes.io/projected/127ceebe-2b4c-45c7-b9e9-ac12e25dce54-kube-api-access-889qk\") pod \"infrawatch-operators-np26s\" (UID: \"127ceebe-2b4c-45c7-b9e9-ac12e25dce54\") " pod="service-telemetry/infrawatch-operators-np26s" Mar 10 00:31:38 crc kubenswrapper[4906]: I0310 00:31:38.207516 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-889qk\" (UniqueName: \"kubernetes.io/projected/127ceebe-2b4c-45c7-b9e9-ac12e25dce54-kube-api-access-889qk\") pod \"infrawatch-operators-np26s\" (UID: \"127ceebe-2b4c-45c7-b9e9-ac12e25dce54\") " 
pod="service-telemetry/infrawatch-operators-np26s" Mar 10 00:31:38 crc kubenswrapper[4906]: I0310 00:31:38.233772 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-889qk\" (UniqueName: \"kubernetes.io/projected/127ceebe-2b4c-45c7-b9e9-ac12e25dce54-kube-api-access-889qk\") pod \"infrawatch-operators-np26s\" (UID: \"127ceebe-2b4c-45c7-b9e9-ac12e25dce54\") " pod="service-telemetry/infrawatch-operators-np26s" Mar 10 00:31:38 crc kubenswrapper[4906]: I0310 00:31:38.422876 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-np26s" Mar 10 00:31:38 crc kubenswrapper[4906]: W0310 00:31:38.655963 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod127ceebe_2b4c_45c7_b9e9_ac12e25dce54.slice/crio-c9dd8e62b32067508f3ed0f265195e3f8ad7460bbc568d9257de3c3d818e3647 WatchSource:0}: Error finding container c9dd8e62b32067508f3ed0f265195e3f8ad7460bbc568d9257de3c3d818e3647: Status 404 returned error can't find the container with id c9dd8e62b32067508f3ed0f265195e3f8ad7460bbc568d9257de3c3d818e3647 Mar 10 00:31:38 crc kubenswrapper[4906]: I0310 00:31:38.686467 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-np26s"] Mar 10 00:31:38 crc kubenswrapper[4906]: I0310 00:31:38.903087 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "6f86320d-c5e5-4ca7-a8be-c8a2683e76c8" (UID: "6f86320d-c5e5-4ca7-a8be-c8a2683e76c8"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:31:38 crc kubenswrapper[4906]: I0310 00:31:38.918005 4906 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/6f86320d-c5e5-4ca7-a8be-c8a2683e76c8-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 10 00:31:39 crc kubenswrapper[4906]: I0310 00:31:39.452212 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-np26s" event={"ID":"127ceebe-2b4c-45c7-b9e9-ac12e25dce54","Type":"ContainerStarted","Data":"c9dd8e62b32067508f3ed0f265195e3f8ad7460bbc568d9257de3c3d818e3647"} Mar 10 00:31:43 crc kubenswrapper[4906]: I0310 00:31:43.069105 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-np26s"] Mar 10 00:31:43 crc kubenswrapper[4906]: I0310 00:31:43.885119 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/infrawatch-operators-tpdm4"] Mar 10 00:31:43 crc kubenswrapper[4906]: I0310 00:31:43.886870 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-tpdm4" Mar 10 00:31:43 crc kubenswrapper[4906]: I0310 00:31:43.888448 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tf4pt\" (UniqueName: \"kubernetes.io/projected/c6d58223-a2ba-4b84-913d-80dbc96dff32-kube-api-access-tf4pt\") pod \"infrawatch-operators-tpdm4\" (UID: \"c6d58223-a2ba-4b84-913d-80dbc96dff32\") " pod="service-telemetry/infrawatch-operators-tpdm4" Mar 10 00:31:43 crc kubenswrapper[4906]: I0310 00:31:43.894016 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-tpdm4"] Mar 10 00:31:43 crc kubenswrapper[4906]: I0310 00:31:43.989505 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tf4pt\" (UniqueName: \"kubernetes.io/projected/c6d58223-a2ba-4b84-913d-80dbc96dff32-kube-api-access-tf4pt\") pod \"infrawatch-operators-tpdm4\" (UID: \"c6d58223-a2ba-4b84-913d-80dbc96dff32\") " pod="service-telemetry/infrawatch-operators-tpdm4" Mar 10 00:31:44 crc kubenswrapper[4906]: I0310 00:31:44.011210 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tf4pt\" (UniqueName: \"kubernetes.io/projected/c6d58223-a2ba-4b84-913d-80dbc96dff32-kube-api-access-tf4pt\") pod \"infrawatch-operators-tpdm4\" (UID: \"c6d58223-a2ba-4b84-913d-80dbc96dff32\") " pod="service-telemetry/infrawatch-operators-tpdm4" Mar 10 00:31:44 crc kubenswrapper[4906]: I0310 00:31:44.220502 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-tpdm4" Mar 10 00:31:49 crc kubenswrapper[4906]: I0310 00:31:49.354774 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-tpdm4"] Mar 10 00:31:49 crc kubenswrapper[4906]: I0310 00:31:49.541684 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-tpdm4" event={"ID":"c6d58223-a2ba-4b84-913d-80dbc96dff32","Type":"ContainerStarted","Data":"88c2287411985ae151f2961408088db839fef795732590043ea6edf9b0e5621f"} Mar 10 00:31:50 crc kubenswrapper[4906]: I0310 00:31:50.550719 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-tpdm4" event={"ID":"c6d58223-a2ba-4b84-913d-80dbc96dff32","Type":"ContainerStarted","Data":"916221ee0f2e8c52553227bb1ccc7d57eb023ac188c5ef03651d60259c082021"} Mar 10 00:31:50 crc kubenswrapper[4906]: I0310 00:31:50.552911 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-np26s" event={"ID":"127ceebe-2b4c-45c7-b9e9-ac12e25dce54","Type":"ContainerStarted","Data":"453ca758075b4488bf34c0f98f2159a3d4387816be111a471ddf6eac7c1cff47"} Mar 10 00:31:50 crc kubenswrapper[4906]: I0310 00:31:50.553071 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/infrawatch-operators-np26s" podUID="127ceebe-2b4c-45c7-b9e9-ac12e25dce54" containerName="registry-server" containerID="cri-o://453ca758075b4488bf34c0f98f2159a3d4387816be111a471ddf6eac7c1cff47" gracePeriod=2 Mar 10 00:31:50 crc kubenswrapper[4906]: I0310 00:31:50.581713 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/infrawatch-operators-tpdm4" podStartSLOduration=7.501752667 podStartE2EDuration="7.581694675s" podCreationTimestamp="2026-03-10 00:31:43 +0000 UTC" firstStartedPulling="2026-03-10 00:31:49.376154899 +0000 UTC m=+1535.524050021" lastFinishedPulling="2026-03-10 
00:31:49.456096917 +0000 UTC m=+1535.603992029" observedRunningTime="2026-03-10 00:31:50.577840357 +0000 UTC m=+1536.725735549" watchObservedRunningTime="2026-03-10 00:31:50.581694675 +0000 UTC m=+1536.729589797" Mar 10 00:31:50 crc kubenswrapper[4906]: I0310 00:31:50.613258 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/infrawatch-operators-np26s" podStartSLOduration=1.816281036 podStartE2EDuration="12.613232058s" podCreationTimestamp="2026-03-10 00:31:38 +0000 UTC" firstStartedPulling="2026-03-10 00:31:38.662223411 +0000 UTC m=+1524.810118523" lastFinishedPulling="2026-03-10 00:31:49.459174433 +0000 UTC m=+1535.607069545" observedRunningTime="2026-03-10 00:31:50.605504021 +0000 UTC m=+1536.753399193" watchObservedRunningTime="2026-03-10 00:31:50.613232058 +0000 UTC m=+1536.761127190" Mar 10 00:31:50 crc kubenswrapper[4906]: I0310 00:31:50.931471 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-np26s" Mar 10 00:31:51 crc kubenswrapper[4906]: I0310 00:31:51.085951 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-889qk\" (UniqueName: \"kubernetes.io/projected/127ceebe-2b4c-45c7-b9e9-ac12e25dce54-kube-api-access-889qk\") pod \"127ceebe-2b4c-45c7-b9e9-ac12e25dce54\" (UID: \"127ceebe-2b4c-45c7-b9e9-ac12e25dce54\") " Mar 10 00:31:51 crc kubenswrapper[4906]: I0310 00:31:51.095750 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/127ceebe-2b4c-45c7-b9e9-ac12e25dce54-kube-api-access-889qk" (OuterVolumeSpecName: "kube-api-access-889qk") pod "127ceebe-2b4c-45c7-b9e9-ac12e25dce54" (UID: "127ceebe-2b4c-45c7-b9e9-ac12e25dce54"). InnerVolumeSpecName "kube-api-access-889qk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:31:51 crc kubenswrapper[4906]: I0310 00:31:51.187926 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-889qk\" (UniqueName: \"kubernetes.io/projected/127ceebe-2b4c-45c7-b9e9-ac12e25dce54-kube-api-access-889qk\") on node \"crc\" DevicePath \"\"" Mar 10 00:31:51 crc kubenswrapper[4906]: I0310 00:31:51.568993 4906 generic.go:334] "Generic (PLEG): container finished" podID="127ceebe-2b4c-45c7-b9e9-ac12e25dce54" containerID="453ca758075b4488bf34c0f98f2159a3d4387816be111a471ddf6eac7c1cff47" exitCode=0 Mar 10 00:31:51 crc kubenswrapper[4906]: I0310 00:31:51.569067 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-np26s" Mar 10 00:31:51 crc kubenswrapper[4906]: I0310 00:31:51.569053 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-np26s" event={"ID":"127ceebe-2b4c-45c7-b9e9-ac12e25dce54","Type":"ContainerDied","Data":"453ca758075b4488bf34c0f98f2159a3d4387816be111a471ddf6eac7c1cff47"} Mar 10 00:31:51 crc kubenswrapper[4906]: I0310 00:31:51.569483 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-np26s" event={"ID":"127ceebe-2b4c-45c7-b9e9-ac12e25dce54","Type":"ContainerDied","Data":"c9dd8e62b32067508f3ed0f265195e3f8ad7460bbc568d9257de3c3d818e3647"} Mar 10 00:31:51 crc kubenswrapper[4906]: I0310 00:31:51.569513 4906 scope.go:117] "RemoveContainer" containerID="453ca758075b4488bf34c0f98f2159a3d4387816be111a471ddf6eac7c1cff47" Mar 10 00:31:51 crc kubenswrapper[4906]: I0310 00:31:51.594119 4906 scope.go:117] "RemoveContainer" containerID="453ca758075b4488bf34c0f98f2159a3d4387816be111a471ddf6eac7c1cff47" Mar 10 00:31:51 crc kubenswrapper[4906]: E0310 00:31:51.595335 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"453ca758075b4488bf34c0f98f2159a3d4387816be111a471ddf6eac7c1cff47\": container with ID starting with 453ca758075b4488bf34c0f98f2159a3d4387816be111a471ddf6eac7c1cff47 not found: ID does not exist" containerID="453ca758075b4488bf34c0f98f2159a3d4387816be111a471ddf6eac7c1cff47" Mar 10 00:31:51 crc kubenswrapper[4906]: I0310 00:31:51.595527 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"453ca758075b4488bf34c0f98f2159a3d4387816be111a471ddf6eac7c1cff47"} err="failed to get container status \"453ca758075b4488bf34c0f98f2159a3d4387816be111a471ddf6eac7c1cff47\": rpc error: code = NotFound desc = could not find container \"453ca758075b4488bf34c0f98f2159a3d4387816be111a471ddf6eac7c1cff47\": container with ID starting with 453ca758075b4488bf34c0f98f2159a3d4387816be111a471ddf6eac7c1cff47 not found: ID does not exist" Mar 10 00:31:51 crc kubenswrapper[4906]: I0310 00:31:51.617235 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-np26s"] Mar 10 00:31:51 crc kubenswrapper[4906]: I0310 00:31:51.624148 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/infrawatch-operators-np26s"] Mar 10 00:31:52 crc kubenswrapper[4906]: I0310 00:31:52.588680 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="127ceebe-2b4c-45c7-b9e9-ac12e25dce54" path="/var/lib/kubelet/pods/127ceebe-2b4c-45c7-b9e9-ac12e25dce54/volumes" Mar 10 00:31:54 crc kubenswrapper[4906]: I0310 00:31:54.220927 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/infrawatch-operators-tpdm4" Mar 10 00:31:54 crc kubenswrapper[4906]: I0310 00:31:54.221010 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/infrawatch-operators-tpdm4" Mar 10 00:31:54 crc kubenswrapper[4906]: I0310 00:31:54.259825 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="service-telemetry/infrawatch-operators-tpdm4" Mar 10 00:31:54 crc kubenswrapper[4906]: I0310 00:31:54.635976 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/infrawatch-operators-tpdm4" Mar 10 00:31:58 crc kubenswrapper[4906]: I0310 00:31:58.925079 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09n97t6"] Mar 10 00:31:58 crc kubenswrapper[4906]: E0310 00:31:58.925600 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="127ceebe-2b4c-45c7-b9e9-ac12e25dce54" containerName="registry-server" Mar 10 00:31:58 crc kubenswrapper[4906]: I0310 00:31:58.925611 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="127ceebe-2b4c-45c7-b9e9-ac12e25dce54" containerName="registry-server" Mar 10 00:31:58 crc kubenswrapper[4906]: I0310 00:31:58.925761 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="127ceebe-2b4c-45c7-b9e9-ac12e25dce54" containerName="registry-server" Mar 10 00:31:58 crc kubenswrapper[4906]: I0310 00:31:58.926547 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09n97t6" Mar 10 00:31:58 crc kubenswrapper[4906]: I0310 00:31:58.943665 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09n97t6"] Mar 10 00:31:59 crc kubenswrapper[4906]: I0310 00:31:59.023678 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5njqc\" (UniqueName: \"kubernetes.io/projected/b36ddc71-b27c-4c64-bd50-e9bd533f8811-kube-api-access-5njqc\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09n97t6\" (UID: \"b36ddc71-b27c-4c64-bd50-e9bd533f8811\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09n97t6" Mar 10 00:31:59 crc kubenswrapper[4906]: I0310 00:31:59.023723 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b36ddc71-b27c-4c64-bd50-e9bd533f8811-bundle\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09n97t6\" (UID: \"b36ddc71-b27c-4c64-bd50-e9bd533f8811\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09n97t6" Mar 10 00:31:59 crc kubenswrapper[4906]: I0310 00:31:59.023759 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b36ddc71-b27c-4c64-bd50-e9bd533f8811-util\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09n97t6\" (UID: \"b36ddc71-b27c-4c64-bd50-e9bd533f8811\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09n97t6" Mar 10 00:31:59 crc kubenswrapper[4906]: I0310 00:31:59.124948 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b36ddc71-b27c-4c64-bd50-e9bd533f8811-util\") pod 
\"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09n97t6\" (UID: \"b36ddc71-b27c-4c64-bd50-e9bd533f8811\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09n97t6" Mar 10 00:31:59 crc kubenswrapper[4906]: I0310 00:31:59.125065 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5njqc\" (UniqueName: \"kubernetes.io/projected/b36ddc71-b27c-4c64-bd50-e9bd533f8811-kube-api-access-5njqc\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09n97t6\" (UID: \"b36ddc71-b27c-4c64-bd50-e9bd533f8811\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09n97t6" Mar 10 00:31:59 crc kubenswrapper[4906]: I0310 00:31:59.125088 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b36ddc71-b27c-4c64-bd50-e9bd533f8811-bundle\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09n97t6\" (UID: \"b36ddc71-b27c-4c64-bd50-e9bd533f8811\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09n97t6" Mar 10 00:31:59 crc kubenswrapper[4906]: I0310 00:31:59.125428 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b36ddc71-b27c-4c64-bd50-e9bd533f8811-util\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09n97t6\" (UID: \"b36ddc71-b27c-4c64-bd50-e9bd533f8811\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09n97t6" Mar 10 00:31:59 crc kubenswrapper[4906]: I0310 00:31:59.125548 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b36ddc71-b27c-4c64-bd50-e9bd533f8811-bundle\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09n97t6\" (UID: \"b36ddc71-b27c-4c64-bd50-e9bd533f8811\") " 
pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09n97t6" Mar 10 00:31:59 crc kubenswrapper[4906]: I0310 00:31:59.143670 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5njqc\" (UniqueName: \"kubernetes.io/projected/b36ddc71-b27c-4c64-bd50-e9bd533f8811-kube-api-access-5njqc\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09n97t6\" (UID: \"b36ddc71-b27c-4c64-bd50-e9bd533f8811\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09n97t6" Mar 10 00:31:59 crc kubenswrapper[4906]: I0310 00:31:59.247660 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09n97t6" Mar 10 00:31:59 crc kubenswrapper[4906]: I0310 00:31:59.443568 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09n97t6"] Mar 10 00:31:59 crc kubenswrapper[4906]: I0310 00:31:59.633666 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09n97t6" event={"ID":"b36ddc71-b27c-4c64-bd50-e9bd533f8811","Type":"ContainerStarted","Data":"a22dcb539508710615d16bacad3e11886ae10f6ad223961e173ed633963ec794"} Mar 10 00:31:59 crc kubenswrapper[4906]: I0310 00:31:59.633925 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09n97t6" event={"ID":"b36ddc71-b27c-4c64-bd50-e9bd533f8811","Type":"ContainerStarted","Data":"82c57eaa1f580e83d8e5c2351a5117e643bd18b77ccd4efc485cceed623d603d"} Mar 10 00:31:59 crc kubenswrapper[4906]: I0310 00:31:59.714833 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65awnzl8"] Mar 10 00:31:59 crc kubenswrapper[4906]: I0310 00:31:59.716370 4906 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65awnzl8" Mar 10 00:31:59 crc kubenswrapper[4906]: I0310 00:31:59.728519 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65awnzl8"] Mar 10 00:31:59 crc kubenswrapper[4906]: I0310 00:31:59.833293 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b318737-9871-4d06-926c-4870362e1d3f-util\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65awnzl8\" (UID: \"2b318737-9871-4d06-926c-4870362e1d3f\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65awnzl8" Mar 10 00:31:59 crc kubenswrapper[4906]: I0310 00:31:59.833381 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b318737-9871-4d06-926c-4870362e1d3f-bundle\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65awnzl8\" (UID: \"2b318737-9871-4d06-926c-4870362e1d3f\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65awnzl8" Mar 10 00:31:59 crc kubenswrapper[4906]: I0310 00:31:59.833432 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4tqw\" (UniqueName: \"kubernetes.io/projected/2b318737-9871-4d06-926c-4870362e1d3f-kube-api-access-w4tqw\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65awnzl8\" (UID: \"2b318737-9871-4d06-926c-4870362e1d3f\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65awnzl8" Mar 10 00:31:59 crc kubenswrapper[4906]: I0310 00:31:59.934729 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/2b318737-9871-4d06-926c-4870362e1d3f-bundle\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65awnzl8\" (UID: \"2b318737-9871-4d06-926c-4870362e1d3f\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65awnzl8" Mar 10 00:31:59 crc kubenswrapper[4906]: I0310 00:31:59.934792 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4tqw\" (UniqueName: \"kubernetes.io/projected/2b318737-9871-4d06-926c-4870362e1d3f-kube-api-access-w4tqw\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65awnzl8\" (UID: \"2b318737-9871-4d06-926c-4870362e1d3f\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65awnzl8" Mar 10 00:31:59 crc kubenswrapper[4906]: I0310 00:31:59.934873 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b318737-9871-4d06-926c-4870362e1d3f-util\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65awnzl8\" (UID: \"2b318737-9871-4d06-926c-4870362e1d3f\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65awnzl8" Mar 10 00:31:59 crc kubenswrapper[4906]: I0310 00:31:59.935613 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b318737-9871-4d06-926c-4870362e1d3f-util\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65awnzl8\" (UID: \"2b318737-9871-4d06-926c-4870362e1d3f\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65awnzl8" Mar 10 00:31:59 crc kubenswrapper[4906]: I0310 00:31:59.935629 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b318737-9871-4d06-926c-4870362e1d3f-bundle\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65awnzl8\" (UID: 
\"2b318737-9871-4d06-926c-4870362e1d3f\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65awnzl8" Mar 10 00:31:59 crc kubenswrapper[4906]: I0310 00:31:59.956463 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4tqw\" (UniqueName: \"kubernetes.io/projected/2b318737-9871-4d06-926c-4870362e1d3f-kube-api-access-w4tqw\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65awnzl8\" (UID: \"2b318737-9871-4d06-926c-4870362e1d3f\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65awnzl8" Mar 10 00:32:00 crc kubenswrapper[4906]: I0310 00:32:00.038961 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65awnzl8" Mar 10 00:32:00 crc kubenswrapper[4906]: I0310 00:32:00.143454 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551712-m7ff8"] Mar 10 00:32:00 crc kubenswrapper[4906]: I0310 00:32:00.144284 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551712-m7ff8" Mar 10 00:32:00 crc kubenswrapper[4906]: I0310 00:32:00.147059 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ktdr6" Mar 10 00:32:00 crc kubenswrapper[4906]: I0310 00:32:00.147318 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 00:32:00 crc kubenswrapper[4906]: I0310 00:32:00.147448 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 00:32:00 crc kubenswrapper[4906]: I0310 00:32:00.188574 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551712-m7ff8"] Mar 10 00:32:00 crc kubenswrapper[4906]: I0310 00:32:00.238792 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz9m4\" (UniqueName: \"kubernetes.io/projected/3d5bc10c-b23f-448d-90a0-539da38f2f76-kube-api-access-wz9m4\") pod \"auto-csr-approver-29551712-m7ff8\" (UID: \"3d5bc10c-b23f-448d-90a0-539da38f2f76\") " pod="openshift-infra/auto-csr-approver-29551712-m7ff8" Mar 10 00:32:00 crc kubenswrapper[4906]: I0310 00:32:00.340455 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wz9m4\" (UniqueName: \"kubernetes.io/projected/3d5bc10c-b23f-448d-90a0-539da38f2f76-kube-api-access-wz9m4\") pod \"auto-csr-approver-29551712-m7ff8\" (UID: \"3d5bc10c-b23f-448d-90a0-539da38f2f76\") " pod="openshift-infra/auto-csr-approver-29551712-m7ff8" Mar 10 00:32:00 crc kubenswrapper[4906]: I0310 00:32:00.367615 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz9m4\" (UniqueName: \"kubernetes.io/projected/3d5bc10c-b23f-448d-90a0-539da38f2f76-kube-api-access-wz9m4\") pod \"auto-csr-approver-29551712-m7ff8\" (UID: \"3d5bc10c-b23f-448d-90a0-539da38f2f76\") " 
pod="openshift-infra/auto-csr-approver-29551712-m7ff8" Mar 10 00:32:00 crc kubenswrapper[4906]: I0310 00:32:00.485463 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551712-m7ff8" Mar 10 00:32:00 crc kubenswrapper[4906]: I0310 00:32:00.490834 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65awnzl8"] Mar 10 00:32:00 crc kubenswrapper[4906]: W0310 00:32:00.499547 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b318737_9871_4d06_926c_4870362e1d3f.slice/crio-12c855b699fcdba6a182b07281fcfe5d0b26c251b7c74f7ea4c37c6d2c4df5e7 WatchSource:0}: Error finding container 12c855b699fcdba6a182b07281fcfe5d0b26c251b7c74f7ea4c37c6d2c4df5e7: Status 404 returned error can't find the container with id 12c855b699fcdba6a182b07281fcfe5d0b26c251b7c74f7ea4c37c6d2c4df5e7 Mar 10 00:32:00 crc kubenswrapper[4906]: I0310 00:32:00.502210 4906 patch_prober.go:28] interesting pod/machine-config-daemon-bxtw4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:32:00 crc kubenswrapper[4906]: I0310 00:32:00.502264 4906 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:32:00 crc kubenswrapper[4906]: I0310 00:32:00.502304 4906 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" Mar 10 00:32:00 crc kubenswrapper[4906]: I0310 
00:32:00.502788 4906 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3d03901045dd3898d4d4472388a38a1a6380b91b22032509551213e74b4671a3"} pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 00:32:00 crc kubenswrapper[4906]: I0310 00:32:00.502843 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" containerName="machine-config-daemon" containerID="cri-o://3d03901045dd3898d4d4472388a38a1a6380b91b22032509551213e74b4671a3" gracePeriod=600 Mar 10 00:32:00 crc kubenswrapper[4906]: E0310 00:32:00.637110 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bxtw4_openshift-machine-config-operator(72d61d35-0a64-45a5-8df3-9c429727deba)\"" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" Mar 10 00:32:00 crc kubenswrapper[4906]: I0310 00:32:00.643950 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65awnzl8" event={"ID":"2b318737-9871-4d06-926c-4870362e1d3f","Type":"ContainerStarted","Data":"12c855b699fcdba6a182b07281fcfe5d0b26c251b7c74f7ea4c37c6d2c4df5e7"} Mar 10 00:32:00 crc kubenswrapper[4906]: I0310 00:32:00.645627 4906 generic.go:334] "Generic (PLEG): container finished" podID="72d61d35-0a64-45a5-8df3-9c429727deba" containerID="3d03901045dd3898d4d4472388a38a1a6380b91b22032509551213e74b4671a3" exitCode=0 Mar 10 00:32:00 crc kubenswrapper[4906]: I0310 00:32:00.645702 4906 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" event={"ID":"72d61d35-0a64-45a5-8df3-9c429727deba","Type":"ContainerDied","Data":"3d03901045dd3898d4d4472388a38a1a6380b91b22032509551213e74b4671a3"} Mar 10 00:32:00 crc kubenswrapper[4906]: I0310 00:32:00.645731 4906 scope.go:117] "RemoveContainer" containerID="6847583fc3b7bdeec69f6786020e94f393a41e01c5039c9b2618c4b51a1b7db5" Mar 10 00:32:00 crc kubenswrapper[4906]: I0310 00:32:00.646325 4906 scope.go:117] "RemoveContainer" containerID="3d03901045dd3898d4d4472388a38a1a6380b91b22032509551213e74b4671a3" Mar 10 00:32:00 crc kubenswrapper[4906]: E0310 00:32:00.646541 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bxtw4_openshift-machine-config-operator(72d61d35-0a64-45a5-8df3-9c429727deba)\"" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" Mar 10 00:32:00 crc kubenswrapper[4906]: I0310 00:32:00.649428 4906 generic.go:334] "Generic (PLEG): container finished" podID="b36ddc71-b27c-4c64-bd50-e9bd533f8811" containerID="a22dcb539508710615d16bacad3e11886ae10f6ad223961e173ed633963ec794" exitCode=0 Mar 10 00:32:00 crc kubenswrapper[4906]: I0310 00:32:00.649468 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09n97t6" event={"ID":"b36ddc71-b27c-4c64-bd50-e9bd533f8811","Type":"ContainerDied","Data":"a22dcb539508710615d16bacad3e11886ae10f6ad223961e173ed633963ec794"} Mar 10 00:32:00 crc kubenswrapper[4906]: I0310 00:32:00.715165 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551712-m7ff8"] Mar 10 00:32:01 crc kubenswrapper[4906]: I0310 00:32:01.662487 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29551712-m7ff8" event={"ID":"3d5bc10c-b23f-448d-90a0-539da38f2f76","Type":"ContainerStarted","Data":"52ff658a2873d4b1f627834edcccddcf5c380521cac539ca7251bdec8462e480"} Mar 10 00:32:01 crc kubenswrapper[4906]: I0310 00:32:01.665986 4906 generic.go:334] "Generic (PLEG): container finished" podID="b36ddc71-b27c-4c64-bd50-e9bd533f8811" containerID="c8712f84131229d8e24db93cc028041a8125dd9bdc5f936e11e47927b2a1b01b" exitCode=0 Mar 10 00:32:01 crc kubenswrapper[4906]: I0310 00:32:01.666030 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09n97t6" event={"ID":"b36ddc71-b27c-4c64-bd50-e9bd533f8811","Type":"ContainerDied","Data":"c8712f84131229d8e24db93cc028041a8125dd9bdc5f936e11e47927b2a1b01b"} Mar 10 00:32:01 crc kubenswrapper[4906]: I0310 00:32:01.669819 4906 generic.go:334] "Generic (PLEG): container finished" podID="2b318737-9871-4d06-926c-4870362e1d3f" containerID="40f0b64eaac5bea756a3cc7ac9daadc1cb0b01b32947c5505c38cf2759bbaff7" exitCode=0 Mar 10 00:32:01 crc kubenswrapper[4906]: I0310 00:32:01.669855 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65awnzl8" event={"ID":"2b318737-9871-4d06-926c-4870362e1d3f","Type":"ContainerDied","Data":"40f0b64eaac5bea756a3cc7ac9daadc1cb0b01b32947c5505c38cf2759bbaff7"} Mar 10 00:32:02 crc kubenswrapper[4906]: I0310 00:32:02.680566 4906 generic.go:334] "Generic (PLEG): container finished" podID="b36ddc71-b27c-4c64-bd50-e9bd533f8811" containerID="8bf8c6cdc0b85c65f892e6d6a90a2ad1de9ccfc4884fa98359f9ad75e98039d0" exitCode=0 Mar 10 00:32:02 crc kubenswrapper[4906]: I0310 00:32:02.680669 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09n97t6" 
event={"ID":"b36ddc71-b27c-4c64-bd50-e9bd533f8811","Type":"ContainerDied","Data":"8bf8c6cdc0b85c65f892e6d6a90a2ad1de9ccfc4884fa98359f9ad75e98039d0"} Mar 10 00:32:02 crc kubenswrapper[4906]: I0310 00:32:02.685336 4906 generic.go:334] "Generic (PLEG): container finished" podID="2b318737-9871-4d06-926c-4870362e1d3f" containerID="27dda9436ffde76374c5137f9648611b11973716c620cea1738987a1275e327f" exitCode=0 Mar 10 00:32:02 crc kubenswrapper[4906]: I0310 00:32:02.685414 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65awnzl8" event={"ID":"2b318737-9871-4d06-926c-4870362e1d3f","Type":"ContainerDied","Data":"27dda9436ffde76374c5137f9648611b11973716c620cea1738987a1275e327f"} Mar 10 00:32:02 crc kubenswrapper[4906]: I0310 00:32:02.688361 4906 generic.go:334] "Generic (PLEG): container finished" podID="3d5bc10c-b23f-448d-90a0-539da38f2f76" containerID="ed64661f222bc2b81849a942b7849b919a743347e94fc2eb75b83c5ac0af93a8" exitCode=0 Mar 10 00:32:02 crc kubenswrapper[4906]: I0310 00:32:02.688398 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551712-m7ff8" event={"ID":"3d5bc10c-b23f-448d-90a0-539da38f2f76","Type":"ContainerDied","Data":"ed64661f222bc2b81849a942b7849b919a743347e94fc2eb75b83c5ac0af93a8"} Mar 10 00:32:03 crc kubenswrapper[4906]: I0310 00:32:03.703550 4906 generic.go:334] "Generic (PLEG): container finished" podID="2b318737-9871-4d06-926c-4870362e1d3f" containerID="03973d00be18a81d4e65076ee9b88fd9d4490029651ae1500e77df8e8bdfa72b" exitCode=0 Mar 10 00:32:03 crc kubenswrapper[4906]: I0310 00:32:03.703622 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65awnzl8" event={"ID":"2b318737-9871-4d06-926c-4870362e1d3f","Type":"ContainerDied","Data":"03973d00be18a81d4e65076ee9b88fd9d4490029651ae1500e77df8e8bdfa72b"} Mar 10 00:32:04 crc kubenswrapper[4906]: I0310 
00:32:04.013422 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551712-m7ff8" Mar 10 00:32:04 crc kubenswrapper[4906]: I0310 00:32:04.020507 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09n97t6" Mar 10 00:32:04 crc kubenswrapper[4906]: I0310 00:32:04.094932 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wz9m4\" (UniqueName: \"kubernetes.io/projected/3d5bc10c-b23f-448d-90a0-539da38f2f76-kube-api-access-wz9m4\") pod \"3d5bc10c-b23f-448d-90a0-539da38f2f76\" (UID: \"3d5bc10c-b23f-448d-90a0-539da38f2f76\") " Mar 10 00:32:04 crc kubenswrapper[4906]: I0310 00:32:04.095226 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b36ddc71-b27c-4c64-bd50-e9bd533f8811-bundle\") pod \"b36ddc71-b27c-4c64-bd50-e9bd533f8811\" (UID: \"b36ddc71-b27c-4c64-bd50-e9bd533f8811\") " Mar 10 00:32:04 crc kubenswrapper[4906]: I0310 00:32:04.095414 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b36ddc71-b27c-4c64-bd50-e9bd533f8811-util\") pod \"b36ddc71-b27c-4c64-bd50-e9bd533f8811\" (UID: \"b36ddc71-b27c-4c64-bd50-e9bd533f8811\") " Mar 10 00:32:04 crc kubenswrapper[4906]: I0310 00:32:04.095531 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5njqc\" (UniqueName: \"kubernetes.io/projected/b36ddc71-b27c-4c64-bd50-e9bd533f8811-kube-api-access-5njqc\") pod \"b36ddc71-b27c-4c64-bd50-e9bd533f8811\" (UID: \"b36ddc71-b27c-4c64-bd50-e9bd533f8811\") " Mar 10 00:32:04 crc kubenswrapper[4906]: I0310 00:32:04.096285 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/b36ddc71-b27c-4c64-bd50-e9bd533f8811-bundle" (OuterVolumeSpecName: "bundle") pod "b36ddc71-b27c-4c64-bd50-e9bd533f8811" (UID: "b36ddc71-b27c-4c64-bd50-e9bd533f8811"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:32:04 crc kubenswrapper[4906]: I0310 00:32:04.102292 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d5bc10c-b23f-448d-90a0-539da38f2f76-kube-api-access-wz9m4" (OuterVolumeSpecName: "kube-api-access-wz9m4") pod "3d5bc10c-b23f-448d-90a0-539da38f2f76" (UID: "3d5bc10c-b23f-448d-90a0-539da38f2f76"). InnerVolumeSpecName "kube-api-access-wz9m4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:32:04 crc kubenswrapper[4906]: I0310 00:32:04.103229 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b36ddc71-b27c-4c64-bd50-e9bd533f8811-kube-api-access-5njqc" (OuterVolumeSpecName: "kube-api-access-5njqc") pod "b36ddc71-b27c-4c64-bd50-e9bd533f8811" (UID: "b36ddc71-b27c-4c64-bd50-e9bd533f8811"). InnerVolumeSpecName "kube-api-access-5njqc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:32:04 crc kubenswrapper[4906]: I0310 00:32:04.112188 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b36ddc71-b27c-4c64-bd50-e9bd533f8811-util" (OuterVolumeSpecName: "util") pod "b36ddc71-b27c-4c64-bd50-e9bd533f8811" (UID: "b36ddc71-b27c-4c64-bd50-e9bd533f8811"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:32:04 crc kubenswrapper[4906]: I0310 00:32:04.198878 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wz9m4\" (UniqueName: \"kubernetes.io/projected/3d5bc10c-b23f-448d-90a0-539da38f2f76-kube-api-access-wz9m4\") on node \"crc\" DevicePath \"\"" Mar 10 00:32:04 crc kubenswrapper[4906]: I0310 00:32:04.199052 4906 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b36ddc71-b27c-4c64-bd50-e9bd533f8811-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 00:32:04 crc kubenswrapper[4906]: I0310 00:32:04.199086 4906 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b36ddc71-b27c-4c64-bd50-e9bd533f8811-util\") on node \"crc\" DevicePath \"\"" Mar 10 00:32:04 crc kubenswrapper[4906]: I0310 00:32:04.199165 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5njqc\" (UniqueName: \"kubernetes.io/projected/b36ddc71-b27c-4c64-bd50-e9bd533f8811-kube-api-access-5njqc\") on node \"crc\" DevicePath \"\"" Mar 10 00:32:04 crc kubenswrapper[4906]: I0310 00:32:04.717016 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09n97t6" Mar 10 00:32:04 crc kubenswrapper[4906]: I0310 00:32:04.716986 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09n97t6" event={"ID":"b36ddc71-b27c-4c64-bd50-e9bd533f8811","Type":"ContainerDied","Data":"82c57eaa1f580e83d8e5c2351a5117e643bd18b77ccd4efc485cceed623d603d"} Mar 10 00:32:04 crc kubenswrapper[4906]: I0310 00:32:04.717150 4906 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82c57eaa1f580e83d8e5c2351a5117e643bd18b77ccd4efc485cceed623d603d" Mar 10 00:32:04 crc kubenswrapper[4906]: I0310 00:32:04.721210 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551712-m7ff8" Mar 10 00:32:04 crc kubenswrapper[4906]: I0310 00:32:04.721375 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551712-m7ff8" event={"ID":"3d5bc10c-b23f-448d-90a0-539da38f2f76","Type":"ContainerDied","Data":"52ff658a2873d4b1f627834edcccddcf5c380521cac539ca7251bdec8462e480"} Mar 10 00:32:04 crc kubenswrapper[4906]: I0310 00:32:04.721419 4906 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52ff658a2873d4b1f627834edcccddcf5c380521cac539ca7251bdec8462e480" Mar 10 00:32:05 crc kubenswrapper[4906]: I0310 00:32:05.017681 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65awnzl8" Mar 10 00:32:05 crc kubenswrapper[4906]: I0310 00:32:05.092790 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551706-pmx5k"] Mar 10 00:32:05 crc kubenswrapper[4906]: I0310 00:32:05.102074 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551706-pmx5k"] Mar 10 00:32:05 crc kubenswrapper[4906]: I0310 00:32:05.112752 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4tqw\" (UniqueName: \"kubernetes.io/projected/2b318737-9871-4d06-926c-4870362e1d3f-kube-api-access-w4tqw\") pod \"2b318737-9871-4d06-926c-4870362e1d3f\" (UID: \"2b318737-9871-4d06-926c-4870362e1d3f\") " Mar 10 00:32:05 crc kubenswrapper[4906]: I0310 00:32:05.113040 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b318737-9871-4d06-926c-4870362e1d3f-util\") pod \"2b318737-9871-4d06-926c-4870362e1d3f\" (UID: \"2b318737-9871-4d06-926c-4870362e1d3f\") " Mar 10 00:32:05 crc kubenswrapper[4906]: I0310 00:32:05.113138 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b318737-9871-4d06-926c-4870362e1d3f-bundle\") pod \"2b318737-9871-4d06-926c-4870362e1d3f\" (UID: \"2b318737-9871-4d06-926c-4870362e1d3f\") " Mar 10 00:32:05 crc kubenswrapper[4906]: I0310 00:32:05.113848 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b318737-9871-4d06-926c-4870362e1d3f-bundle" (OuterVolumeSpecName: "bundle") pod "2b318737-9871-4d06-926c-4870362e1d3f" (UID: "2b318737-9871-4d06-926c-4870362e1d3f"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:32:05 crc kubenswrapper[4906]: I0310 00:32:05.117797 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b318737-9871-4d06-926c-4870362e1d3f-kube-api-access-w4tqw" (OuterVolumeSpecName: "kube-api-access-w4tqw") pod "2b318737-9871-4d06-926c-4870362e1d3f" (UID: "2b318737-9871-4d06-926c-4870362e1d3f"). InnerVolumeSpecName "kube-api-access-w4tqw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:32:05 crc kubenswrapper[4906]: I0310 00:32:05.129062 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b318737-9871-4d06-926c-4870362e1d3f-util" (OuterVolumeSpecName: "util") pod "2b318737-9871-4d06-926c-4870362e1d3f" (UID: "2b318737-9871-4d06-926c-4870362e1d3f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:32:05 crc kubenswrapper[4906]: I0310 00:32:05.215060 4906 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b318737-9871-4d06-926c-4870362e1d3f-util\") on node \"crc\" DevicePath \"\"" Mar 10 00:32:05 crc kubenswrapper[4906]: I0310 00:32:05.215120 4906 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b318737-9871-4d06-926c-4870362e1d3f-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 00:32:05 crc kubenswrapper[4906]: I0310 00:32:05.215140 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4tqw\" (UniqueName: \"kubernetes.io/projected/2b318737-9871-4d06-926c-4870362e1d3f-kube-api-access-w4tqw\") on node \"crc\" DevicePath \"\"" Mar 10 00:32:05 crc kubenswrapper[4906]: I0310 00:32:05.735985 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65awnzl8" 
event={"ID":"2b318737-9871-4d06-926c-4870362e1d3f","Type":"ContainerDied","Data":"12c855b699fcdba6a182b07281fcfe5d0b26c251b7c74f7ea4c37c6d2c4df5e7"} Mar 10 00:32:05 crc kubenswrapper[4906]: I0310 00:32:05.736057 4906 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12c855b699fcdba6a182b07281fcfe5d0b26c251b7c74f7ea4c37c6d2c4df5e7" Mar 10 00:32:05 crc kubenswrapper[4906]: I0310 00:32:05.736064 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65awnzl8" Mar 10 00:32:06 crc kubenswrapper[4906]: I0310 00:32:06.587887 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a884931-579c-4774-83da-f61f0bf930c8" path="/var/lib/kubelet/pods/3a884931-579c-4774-83da-f61f0bf930c8/volumes" Mar 10 00:32:09 crc kubenswrapper[4906]: I0310 00:32:09.772907 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-76f76f667-9zzn2"] Mar 10 00:32:09 crc kubenswrapper[4906]: E0310 00:32:09.773767 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b318737-9871-4d06-926c-4870362e1d3f" containerName="util" Mar 10 00:32:09 crc kubenswrapper[4906]: I0310 00:32:09.773783 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b318737-9871-4d06-926c-4870362e1d3f" containerName="util" Mar 10 00:32:09 crc kubenswrapper[4906]: E0310 00:32:09.773794 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b318737-9871-4d06-926c-4870362e1d3f" containerName="extract" Mar 10 00:32:09 crc kubenswrapper[4906]: I0310 00:32:09.773802 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b318737-9871-4d06-926c-4870362e1d3f" containerName="extract" Mar 10 00:32:09 crc kubenswrapper[4906]: E0310 00:32:09.773826 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b36ddc71-b27c-4c64-bd50-e9bd533f8811" containerName="pull" Mar 10 00:32:09 crc 
kubenswrapper[4906]: I0310 00:32:09.773837 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="b36ddc71-b27c-4c64-bd50-e9bd533f8811" containerName="pull" Mar 10 00:32:09 crc kubenswrapper[4906]: E0310 00:32:09.773862 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b36ddc71-b27c-4c64-bd50-e9bd533f8811" containerName="util" Mar 10 00:32:09 crc kubenswrapper[4906]: I0310 00:32:09.773870 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="b36ddc71-b27c-4c64-bd50-e9bd533f8811" containerName="util" Mar 10 00:32:09 crc kubenswrapper[4906]: E0310 00:32:09.773880 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b318737-9871-4d06-926c-4870362e1d3f" containerName="pull" Mar 10 00:32:09 crc kubenswrapper[4906]: I0310 00:32:09.773887 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b318737-9871-4d06-926c-4870362e1d3f" containerName="pull" Mar 10 00:32:09 crc kubenswrapper[4906]: E0310 00:32:09.773903 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b36ddc71-b27c-4c64-bd50-e9bd533f8811" containerName="extract" Mar 10 00:32:09 crc kubenswrapper[4906]: I0310 00:32:09.773911 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="b36ddc71-b27c-4c64-bd50-e9bd533f8811" containerName="extract" Mar 10 00:32:09 crc kubenswrapper[4906]: E0310 00:32:09.773921 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d5bc10c-b23f-448d-90a0-539da38f2f76" containerName="oc" Mar 10 00:32:09 crc kubenswrapper[4906]: I0310 00:32:09.773931 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d5bc10c-b23f-448d-90a0-539da38f2f76" containerName="oc" Mar 10 00:32:09 crc kubenswrapper[4906]: I0310 00:32:09.774061 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="b36ddc71-b27c-4c64-bd50-e9bd533f8811" containerName="extract" Mar 10 00:32:09 crc kubenswrapper[4906]: I0310 00:32:09.774078 4906 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="2b318737-9871-4d06-926c-4870362e1d3f" containerName="extract" Mar 10 00:32:09 crc kubenswrapper[4906]: I0310 00:32:09.774109 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d5bc10c-b23f-448d-90a0-539da38f2f76" containerName="oc" Mar 10 00:32:09 crc kubenswrapper[4906]: I0310 00:32:09.774851 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-76f76f667-9zzn2" Mar 10 00:32:09 crc kubenswrapper[4906]: I0310 00:32:09.777980 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-operator-dockercfg-65thg" Mar 10 00:32:09 crc kubenswrapper[4906]: I0310 00:32:09.788505 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-76f76f667-9zzn2"] Mar 10 00:32:09 crc kubenswrapper[4906]: I0310 00:32:09.878310 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv5jr\" (UniqueName: \"kubernetes.io/projected/d2eb5756-28ad-445b-b3f8-1fa0846cd41b-kube-api-access-rv5jr\") pod \"smart-gateway-operator-76f76f667-9zzn2\" (UID: \"d2eb5756-28ad-445b-b3f8-1fa0846cd41b\") " pod="service-telemetry/smart-gateway-operator-76f76f667-9zzn2" Mar 10 00:32:09 crc kubenswrapper[4906]: I0310 00:32:09.878539 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/d2eb5756-28ad-445b-b3f8-1fa0846cd41b-runner\") pod \"smart-gateway-operator-76f76f667-9zzn2\" (UID: \"d2eb5756-28ad-445b-b3f8-1fa0846cd41b\") " pod="service-telemetry/smart-gateway-operator-76f76f667-9zzn2" Mar 10 00:32:09 crc kubenswrapper[4906]: I0310 00:32:09.980770 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rv5jr\" (UniqueName: \"kubernetes.io/projected/d2eb5756-28ad-445b-b3f8-1fa0846cd41b-kube-api-access-rv5jr\") pod 
\"smart-gateway-operator-76f76f667-9zzn2\" (UID: \"d2eb5756-28ad-445b-b3f8-1fa0846cd41b\") " pod="service-telemetry/smart-gateway-operator-76f76f667-9zzn2" Mar 10 00:32:09 crc kubenswrapper[4906]: I0310 00:32:09.980916 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/d2eb5756-28ad-445b-b3f8-1fa0846cd41b-runner\") pod \"smart-gateway-operator-76f76f667-9zzn2\" (UID: \"d2eb5756-28ad-445b-b3f8-1fa0846cd41b\") " pod="service-telemetry/smart-gateway-operator-76f76f667-9zzn2" Mar 10 00:32:09 crc kubenswrapper[4906]: I0310 00:32:09.981507 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/d2eb5756-28ad-445b-b3f8-1fa0846cd41b-runner\") pod \"smart-gateway-operator-76f76f667-9zzn2\" (UID: \"d2eb5756-28ad-445b-b3f8-1fa0846cd41b\") " pod="service-telemetry/smart-gateway-operator-76f76f667-9zzn2" Mar 10 00:32:10 crc kubenswrapper[4906]: I0310 00:32:10.005963 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv5jr\" (UniqueName: \"kubernetes.io/projected/d2eb5756-28ad-445b-b3f8-1fa0846cd41b-kube-api-access-rv5jr\") pod \"smart-gateway-operator-76f76f667-9zzn2\" (UID: \"d2eb5756-28ad-445b-b3f8-1fa0846cd41b\") " pod="service-telemetry/smart-gateway-operator-76f76f667-9zzn2" Mar 10 00:32:10 crc kubenswrapper[4906]: I0310 00:32:10.093186 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-76f76f667-9zzn2" Mar 10 00:32:10 crc kubenswrapper[4906]: I0310 00:32:10.377423 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-76f76f667-9zzn2"] Mar 10 00:32:10 crc kubenswrapper[4906]: W0310 00:32:10.380898 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2eb5756_28ad_445b_b3f8_1fa0846cd41b.slice/crio-39ae396d4c65af413f82a4c3d569ba5899fec5f3b6cf6a192ad9849f97f85894 WatchSource:0}: Error finding container 39ae396d4c65af413f82a4c3d569ba5899fec5f3b6cf6a192ad9849f97f85894: Status 404 returned error can't find the container with id 39ae396d4c65af413f82a4c3d569ba5899fec5f3b6cf6a192ad9849f97f85894 Mar 10 00:32:10 crc kubenswrapper[4906]: I0310 00:32:10.791523 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-76f76f667-9zzn2" event={"ID":"d2eb5756-28ad-445b-b3f8-1fa0846cd41b","Type":"ContainerStarted","Data":"39ae396d4c65af413f82a4c3d569ba5899fec5f3b6cf6a192ad9849f97f85894"} Mar 10 00:32:11 crc kubenswrapper[4906]: I0310 00:32:11.576518 4906 scope.go:117] "RemoveContainer" containerID="3d03901045dd3898d4d4472388a38a1a6380b91b22032509551213e74b4671a3" Mar 10 00:32:11 crc kubenswrapper[4906]: E0310 00:32:11.577062 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bxtw4_openshift-machine-config-operator(72d61d35-0a64-45a5-8df3-9c429727deba)\"" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" Mar 10 00:32:12 crc kubenswrapper[4906]: I0310 00:32:12.887431 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-6b6c8c999c-7wl2m"] 
Mar 10 00:32:12 crc kubenswrapper[4906]: I0310 00:32:12.888422 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-6b6c8c999c-7wl2m" Mar 10 00:32:12 crc kubenswrapper[4906]: I0310 00:32:12.906318 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"service-telemetry-operator-dockercfg-rsrhx" Mar 10 00:32:12 crc kubenswrapper[4906]: I0310 00:32:12.914836 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-6b6c8c999c-7wl2m"] Mar 10 00:32:13 crc kubenswrapper[4906]: I0310 00:32:13.028176 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsf2w\" (UniqueName: \"kubernetes.io/projected/94d067fb-9889-49b0-b722-ede22e5be330-kube-api-access-vsf2w\") pod \"service-telemetry-operator-6b6c8c999c-7wl2m\" (UID: \"94d067fb-9889-49b0-b722-ede22e5be330\") " pod="service-telemetry/service-telemetry-operator-6b6c8c999c-7wl2m" Mar 10 00:32:13 crc kubenswrapper[4906]: I0310 00:32:13.028242 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/94d067fb-9889-49b0-b722-ede22e5be330-runner\") pod \"service-telemetry-operator-6b6c8c999c-7wl2m\" (UID: \"94d067fb-9889-49b0-b722-ede22e5be330\") " pod="service-telemetry/service-telemetry-operator-6b6c8c999c-7wl2m" Mar 10 00:32:13 crc kubenswrapper[4906]: I0310 00:32:13.129571 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsf2w\" (UniqueName: \"kubernetes.io/projected/94d067fb-9889-49b0-b722-ede22e5be330-kube-api-access-vsf2w\") pod \"service-telemetry-operator-6b6c8c999c-7wl2m\" (UID: \"94d067fb-9889-49b0-b722-ede22e5be330\") " pod="service-telemetry/service-telemetry-operator-6b6c8c999c-7wl2m" Mar 10 00:32:13 crc kubenswrapper[4906]: I0310 00:32:13.129615 4906 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/94d067fb-9889-49b0-b722-ede22e5be330-runner\") pod \"service-telemetry-operator-6b6c8c999c-7wl2m\" (UID: \"94d067fb-9889-49b0-b722-ede22e5be330\") " pod="service-telemetry/service-telemetry-operator-6b6c8c999c-7wl2m" Mar 10 00:32:13 crc kubenswrapper[4906]: I0310 00:32:13.130117 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/94d067fb-9889-49b0-b722-ede22e5be330-runner\") pod \"service-telemetry-operator-6b6c8c999c-7wl2m\" (UID: \"94d067fb-9889-49b0-b722-ede22e5be330\") " pod="service-telemetry/service-telemetry-operator-6b6c8c999c-7wl2m" Mar 10 00:32:13 crc kubenswrapper[4906]: I0310 00:32:13.150555 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsf2w\" (UniqueName: \"kubernetes.io/projected/94d067fb-9889-49b0-b722-ede22e5be330-kube-api-access-vsf2w\") pod \"service-telemetry-operator-6b6c8c999c-7wl2m\" (UID: \"94d067fb-9889-49b0-b722-ede22e5be330\") " pod="service-telemetry/service-telemetry-operator-6b6c8c999c-7wl2m" Mar 10 00:32:13 crc kubenswrapper[4906]: I0310 00:32:13.238417 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-6b6c8c999c-7wl2m" Mar 10 00:32:22 crc kubenswrapper[4906]: I0310 00:32:22.367371 4906 scope.go:117] "RemoveContainer" containerID="c4564953997dd352418213cb088ceb3f283d94aae07ff89e5ce9e44357d4b6d6" Mar 10 00:32:22 crc kubenswrapper[4906]: I0310 00:32:22.867676 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-6b6c8c999c-7wl2m"] Mar 10 00:32:23 crc kubenswrapper[4906]: W0310 00:32:23.426748 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94d067fb_9889_49b0_b722_ede22e5be330.slice/crio-dd3185dd5c9dd998204a3dabf529478e80396a308e18aa5bf614c53d34639a2f WatchSource:0}: Error finding container dd3185dd5c9dd998204a3dabf529478e80396a308e18aa5bf614c53d34639a2f: Status 404 returned error can't find the container with id dd3185dd5c9dd998204a3dabf529478e80396a308e18aa5bf614c53d34639a2f Mar 10 00:32:23 crc kubenswrapper[4906]: I0310 00:32:23.576277 4906 scope.go:117] "RemoveContainer" containerID="3d03901045dd3898d4d4472388a38a1a6380b91b22032509551213e74b4671a3" Mar 10 00:32:23 crc kubenswrapper[4906]: E0310 00:32:23.576553 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bxtw4_openshift-machine-config-operator(72d61d35-0a64-45a5-8df3-9c429727deba)\"" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" Mar 10 00:32:23 crc kubenswrapper[4906]: I0310 00:32:23.900622 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-6b6c8c999c-7wl2m" 
event={"ID":"94d067fb-9889-49b0-b722-ede22e5be330","Type":"ContainerStarted","Data":"dd3185dd5c9dd998204a3dabf529478e80396a308e18aa5bf614c53d34639a2f"} Mar 10 00:32:26 crc kubenswrapper[4906]: E0310 00:32:26.779622 4906 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/infrawatch/smart-gateway-operator:stable-1.5" Mar 10 00:32:26 crc kubenswrapper[4906]: E0310 00:32:26.780174 4906 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/infrawatch/smart-gateway-operator:stable-1.5,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:WATCH_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.annotations['olm.targetNamespaces'],},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:OPERATOR_NAME,Value:smart-gateway-operator,ValueFrom:nil,},EnvVar{Name:ANSIBLE_GATHERING,Value:explicit,ValueFrom:nil,},EnvVar{Name:ANSIBLE_VERBOSITY_SMARTGATEWAY_SMARTGATEWAY_INFRA_WATCH,Value:4,ValueFrom:nil,},EnvVar{Name:ANSIBLE_DEBUG_LOGS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CORE_SMARTGATEWAY_IMAGE,Value:image-registry.openshift-image-registry.svc:5000/service-telemetry/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BRIDGE_SMARTGATEWAY_IMAGE,Value:image-registry.openshift-image-registry.svc:5000/service-telemetry/sg-bridge:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OAUTH_PROXY_IMAGE,Value:quay.io/openshift/origin-oauth-proxy:latest,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:smart-gateway-operator.v5.0.1773102587,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},Vo
lumeMounts:[]VolumeMount{VolumeMount{Name:runner,ReadOnly:false,MountPath:/tmp/ansible-operator/runner,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rv5jr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000670000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod smart-gateway-operator-76f76f667-9zzn2_service-telemetry(d2eb5756-28ad-445b-b3f8-1fa0846cd41b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 00:32:26 crc kubenswrapper[4906]: E0310 00:32:26.782289 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="service-telemetry/smart-gateway-operator-76f76f667-9zzn2" podUID="d2eb5756-28ad-445b-b3f8-1fa0846cd41b" Mar 10 00:32:26 crc kubenswrapper[4906]: E0310 00:32:26.926740 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/infrawatch/smart-gateway-operator:stable-1.5\\\"\"" pod="service-telemetry/smart-gateway-operator-76f76f667-9zzn2" podUID="d2eb5756-28ad-445b-b3f8-1fa0846cd41b" Mar 10 00:32:31 crc kubenswrapper[4906]: I0310 
00:32:31.961076 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-6b6c8c999c-7wl2m" event={"ID":"94d067fb-9889-49b0-b722-ede22e5be330","Type":"ContainerStarted","Data":"ec976a5e35a2e541bd872c5bb3f08900ea4c2becdbf6928c4f4fed1f947fc373"} Mar 10 00:32:31 crc kubenswrapper[4906]: I0310 00:32:31.986335 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-6b6c8c999c-7wl2m" podStartSLOduration=11.75856146 podStartE2EDuration="19.986316049s" podCreationTimestamp="2026-03-10 00:32:12 +0000 UTC" firstStartedPulling="2026-03-10 00:32:23.430077054 +0000 UTC m=+1569.577972176" lastFinishedPulling="2026-03-10 00:32:31.657831613 +0000 UTC m=+1577.805726765" observedRunningTime="2026-03-10 00:32:31.977023618 +0000 UTC m=+1578.124918740" watchObservedRunningTime="2026-03-10 00:32:31.986316049 +0000 UTC m=+1578.134211171" Mar 10 00:32:36 crc kubenswrapper[4906]: I0310 00:32:36.576288 4906 scope.go:117] "RemoveContainer" containerID="3d03901045dd3898d4d4472388a38a1a6380b91b22032509551213e74b4671a3" Mar 10 00:32:36 crc kubenswrapper[4906]: E0310 00:32:36.577767 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bxtw4_openshift-machine-config-operator(72d61d35-0a64-45a5-8df3-9c429727deba)\"" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" Mar 10 00:32:38 crc kubenswrapper[4906]: I0310 00:32:38.480323 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-78rrb"] Mar 10 00:32:38 crc kubenswrapper[4906]: I0310 00:32:38.482174 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-78rrb" Mar 10 00:32:38 crc kubenswrapper[4906]: I0310 00:32:38.482680 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgf8n\" (UniqueName: \"kubernetes.io/projected/65844923-9bb8-44b1-8983-6bf8d7ff5d50-kube-api-access-cgf8n\") pod \"community-operators-78rrb\" (UID: \"65844923-9bb8-44b1-8983-6bf8d7ff5d50\") " pod="openshift-marketplace/community-operators-78rrb" Mar 10 00:32:38 crc kubenswrapper[4906]: I0310 00:32:38.482838 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65844923-9bb8-44b1-8983-6bf8d7ff5d50-catalog-content\") pod \"community-operators-78rrb\" (UID: \"65844923-9bb8-44b1-8983-6bf8d7ff5d50\") " pod="openshift-marketplace/community-operators-78rrb" Mar 10 00:32:38 crc kubenswrapper[4906]: I0310 00:32:38.482886 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65844923-9bb8-44b1-8983-6bf8d7ff5d50-utilities\") pod \"community-operators-78rrb\" (UID: \"65844923-9bb8-44b1-8983-6bf8d7ff5d50\") " pod="openshift-marketplace/community-operators-78rrb" Mar 10 00:32:38 crc kubenswrapper[4906]: I0310 00:32:38.495575 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-78rrb"] Mar 10 00:32:38 crc kubenswrapper[4906]: I0310 00:32:38.585058 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgf8n\" (UniqueName: \"kubernetes.io/projected/65844923-9bb8-44b1-8983-6bf8d7ff5d50-kube-api-access-cgf8n\") pod \"community-operators-78rrb\" (UID: \"65844923-9bb8-44b1-8983-6bf8d7ff5d50\") " pod="openshift-marketplace/community-operators-78rrb" Mar 10 00:32:38 crc kubenswrapper[4906]: I0310 00:32:38.585214 4906 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65844923-9bb8-44b1-8983-6bf8d7ff5d50-catalog-content\") pod \"community-operators-78rrb\" (UID: \"65844923-9bb8-44b1-8983-6bf8d7ff5d50\") " pod="openshift-marketplace/community-operators-78rrb" Mar 10 00:32:38 crc kubenswrapper[4906]: I0310 00:32:38.585364 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65844923-9bb8-44b1-8983-6bf8d7ff5d50-utilities\") pod \"community-operators-78rrb\" (UID: \"65844923-9bb8-44b1-8983-6bf8d7ff5d50\") " pod="openshift-marketplace/community-operators-78rrb" Mar 10 00:32:38 crc kubenswrapper[4906]: I0310 00:32:38.585740 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65844923-9bb8-44b1-8983-6bf8d7ff5d50-catalog-content\") pod \"community-operators-78rrb\" (UID: \"65844923-9bb8-44b1-8983-6bf8d7ff5d50\") " pod="openshift-marketplace/community-operators-78rrb" Mar 10 00:32:38 crc kubenswrapper[4906]: I0310 00:32:38.585900 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65844923-9bb8-44b1-8983-6bf8d7ff5d50-utilities\") pod \"community-operators-78rrb\" (UID: \"65844923-9bb8-44b1-8983-6bf8d7ff5d50\") " pod="openshift-marketplace/community-operators-78rrb" Mar 10 00:32:38 crc kubenswrapper[4906]: I0310 00:32:38.604829 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgf8n\" (UniqueName: \"kubernetes.io/projected/65844923-9bb8-44b1-8983-6bf8d7ff5d50-kube-api-access-cgf8n\") pod \"community-operators-78rrb\" (UID: \"65844923-9bb8-44b1-8983-6bf8d7ff5d50\") " pod="openshift-marketplace/community-operators-78rrb" Mar 10 00:32:38 crc kubenswrapper[4906]: I0310 00:32:38.804267 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-78rrb" Mar 10 00:32:39 crc kubenswrapper[4906]: I0310 00:32:39.239259 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-78rrb"] Mar 10 00:32:40 crc kubenswrapper[4906]: I0310 00:32:40.020421 4906 generic.go:334] "Generic (PLEG): container finished" podID="65844923-9bb8-44b1-8983-6bf8d7ff5d50" containerID="7c926426bffe37090161021549ab441089c15b430c1d28a35318d466d32a95b6" exitCode=0 Mar 10 00:32:40 crc kubenswrapper[4906]: I0310 00:32:40.020515 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-78rrb" event={"ID":"65844923-9bb8-44b1-8983-6bf8d7ff5d50","Type":"ContainerDied","Data":"7c926426bffe37090161021549ab441089c15b430c1d28a35318d466d32a95b6"} Mar 10 00:32:40 crc kubenswrapper[4906]: I0310 00:32:40.020822 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-78rrb" event={"ID":"65844923-9bb8-44b1-8983-6bf8d7ff5d50","Type":"ContainerStarted","Data":"d50629336b9053d8d26f0b45faeffb148fdb8cf9051f3b871426c58b4c459e2d"} Mar 10 00:32:41 crc kubenswrapper[4906]: I0310 00:32:41.027900 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-78rrb" event={"ID":"65844923-9bb8-44b1-8983-6bf8d7ff5d50","Type":"ContainerStarted","Data":"d9668b84a47893d204ac8df2667b17a2a0c5a9359542ffb26bdad93328704645"} Mar 10 00:32:42 crc kubenswrapper[4906]: I0310 00:32:42.040380 4906 generic.go:334] "Generic (PLEG): container finished" podID="65844923-9bb8-44b1-8983-6bf8d7ff5d50" containerID="d9668b84a47893d204ac8df2667b17a2a0c5a9359542ffb26bdad93328704645" exitCode=0 Mar 10 00:32:42 crc kubenswrapper[4906]: I0310 00:32:42.042153 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-78rrb" 
event={"ID":"65844923-9bb8-44b1-8983-6bf8d7ff5d50","Type":"ContainerDied","Data":"d9668b84a47893d204ac8df2667b17a2a0c5a9359542ffb26bdad93328704645"} Mar 10 00:32:43 crc kubenswrapper[4906]: I0310 00:32:43.049448 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-76f76f667-9zzn2" event={"ID":"d2eb5756-28ad-445b-b3f8-1fa0846cd41b","Type":"ContainerStarted","Data":"c154d1257847c2f8cae97d8db71cb2bdde1f15f6c1b7872944ea9e5c884466b2"} Mar 10 00:32:43 crc kubenswrapper[4906]: I0310 00:32:43.051685 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-78rrb" event={"ID":"65844923-9bb8-44b1-8983-6bf8d7ff5d50","Type":"ContainerStarted","Data":"9144ccd08691e039d433ae36aaabc8da71e38b456497fd8169de47d18f15027b"} Mar 10 00:32:43 crc kubenswrapper[4906]: I0310 00:32:43.071576 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-76f76f667-9zzn2" podStartSLOduration=2.402169346 podStartE2EDuration="34.071561091s" podCreationTimestamp="2026-03-10 00:32:09 +0000 UTC" firstStartedPulling="2026-03-10 00:32:10.385459797 +0000 UTC m=+1556.533354929" lastFinishedPulling="2026-03-10 00:32:42.054851532 +0000 UTC m=+1588.202746674" observedRunningTime="2026-03-10 00:32:43.067009313 +0000 UTC m=+1589.214904425" watchObservedRunningTime="2026-03-10 00:32:43.071561091 +0000 UTC m=+1589.219456203" Mar 10 00:32:43 crc kubenswrapper[4906]: I0310 00:32:43.093316 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-78rrb" podStartSLOduration=2.670264362 podStartE2EDuration="5.093297041s" podCreationTimestamp="2026-03-10 00:32:38 +0000 UTC" firstStartedPulling="2026-03-10 00:32:40.022580609 +0000 UTC m=+1586.170475721" lastFinishedPulling="2026-03-10 00:32:42.445613288 +0000 UTC m=+1588.593508400" observedRunningTime="2026-03-10 00:32:43.088045444 +0000 UTC 
m=+1589.235940546" watchObservedRunningTime="2026-03-10 00:32:43.093297041 +0000 UTC m=+1589.241192153" Mar 10 00:32:47 crc kubenswrapper[4906]: I0310 00:32:47.576103 4906 scope.go:117] "RemoveContainer" containerID="3d03901045dd3898d4d4472388a38a1a6380b91b22032509551213e74b4671a3" Mar 10 00:32:47 crc kubenswrapper[4906]: E0310 00:32:47.576705 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bxtw4_openshift-machine-config-operator(72d61d35-0a64-45a5-8df3-9c429727deba)\"" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" Mar 10 00:32:48 crc kubenswrapper[4906]: I0310 00:32:48.805357 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-78rrb" Mar 10 00:32:48 crc kubenswrapper[4906]: I0310 00:32:48.805441 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-78rrb" Mar 10 00:32:48 crc kubenswrapper[4906]: I0310 00:32:48.886527 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-78rrb" Mar 10 00:32:49 crc kubenswrapper[4906]: I0310 00:32:49.164737 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-78rrb" Mar 10 00:32:51 crc kubenswrapper[4906]: I0310 00:32:51.069407 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-78rrb"] Mar 10 00:32:51 crc kubenswrapper[4906]: I0310 00:32:51.111502 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-78rrb" podUID="65844923-9bb8-44b1-8983-6bf8d7ff5d50" containerName="registry-server" 
containerID="cri-o://9144ccd08691e039d433ae36aaabc8da71e38b456497fd8169de47d18f15027b" gracePeriod=2 Mar 10 00:32:51 crc kubenswrapper[4906]: I0310 00:32:51.488799 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-78rrb" Mar 10 00:32:51 crc kubenswrapper[4906]: I0310 00:32:51.578774 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65844923-9bb8-44b1-8983-6bf8d7ff5d50-utilities\") pod \"65844923-9bb8-44b1-8983-6bf8d7ff5d50\" (UID: \"65844923-9bb8-44b1-8983-6bf8d7ff5d50\") " Mar 10 00:32:51 crc kubenswrapper[4906]: I0310 00:32:51.578887 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgf8n\" (UniqueName: \"kubernetes.io/projected/65844923-9bb8-44b1-8983-6bf8d7ff5d50-kube-api-access-cgf8n\") pod \"65844923-9bb8-44b1-8983-6bf8d7ff5d50\" (UID: \"65844923-9bb8-44b1-8983-6bf8d7ff5d50\") " Mar 10 00:32:51 crc kubenswrapper[4906]: I0310 00:32:51.578925 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65844923-9bb8-44b1-8983-6bf8d7ff5d50-catalog-content\") pod \"65844923-9bb8-44b1-8983-6bf8d7ff5d50\" (UID: \"65844923-9bb8-44b1-8983-6bf8d7ff5d50\") " Mar 10 00:32:51 crc kubenswrapper[4906]: I0310 00:32:51.582469 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65844923-9bb8-44b1-8983-6bf8d7ff5d50-utilities" (OuterVolumeSpecName: "utilities") pod "65844923-9bb8-44b1-8983-6bf8d7ff5d50" (UID: "65844923-9bb8-44b1-8983-6bf8d7ff5d50"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:32:51 crc kubenswrapper[4906]: I0310 00:32:51.593835 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65844923-9bb8-44b1-8983-6bf8d7ff5d50-kube-api-access-cgf8n" (OuterVolumeSpecName: "kube-api-access-cgf8n") pod "65844923-9bb8-44b1-8983-6bf8d7ff5d50" (UID: "65844923-9bb8-44b1-8983-6bf8d7ff5d50"). InnerVolumeSpecName "kube-api-access-cgf8n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:32:51 crc kubenswrapper[4906]: I0310 00:32:51.642038 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65844923-9bb8-44b1-8983-6bf8d7ff5d50-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "65844923-9bb8-44b1-8983-6bf8d7ff5d50" (UID: "65844923-9bb8-44b1-8983-6bf8d7ff5d50"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:32:51 crc kubenswrapper[4906]: I0310 00:32:51.679629 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgf8n\" (UniqueName: \"kubernetes.io/projected/65844923-9bb8-44b1-8983-6bf8d7ff5d50-kube-api-access-cgf8n\") on node \"crc\" DevicePath \"\"" Mar 10 00:32:51 crc kubenswrapper[4906]: I0310 00:32:51.679918 4906 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65844923-9bb8-44b1-8983-6bf8d7ff5d50-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 00:32:51 crc kubenswrapper[4906]: I0310 00:32:51.679927 4906 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65844923-9bb8-44b1-8983-6bf8d7ff5d50-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 00:32:52 crc kubenswrapper[4906]: I0310 00:32:52.127611 4906 generic.go:334] "Generic (PLEG): container finished" podID="65844923-9bb8-44b1-8983-6bf8d7ff5d50" 
containerID="9144ccd08691e039d433ae36aaabc8da71e38b456497fd8169de47d18f15027b" exitCode=0 Mar 10 00:32:52 crc kubenswrapper[4906]: I0310 00:32:52.127674 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-78rrb" event={"ID":"65844923-9bb8-44b1-8983-6bf8d7ff5d50","Type":"ContainerDied","Data":"9144ccd08691e039d433ae36aaabc8da71e38b456497fd8169de47d18f15027b"} Mar 10 00:32:52 crc kubenswrapper[4906]: I0310 00:32:52.127702 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-78rrb" event={"ID":"65844923-9bb8-44b1-8983-6bf8d7ff5d50","Type":"ContainerDied","Data":"d50629336b9053d8d26f0b45faeffb148fdb8cf9051f3b871426c58b4c459e2d"} Mar 10 00:32:52 crc kubenswrapper[4906]: I0310 00:32:52.127721 4906 scope.go:117] "RemoveContainer" containerID="9144ccd08691e039d433ae36aaabc8da71e38b456497fd8169de47d18f15027b" Mar 10 00:32:52 crc kubenswrapper[4906]: I0310 00:32:52.127834 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-78rrb" Mar 10 00:32:52 crc kubenswrapper[4906]: I0310 00:32:52.163942 4906 scope.go:117] "RemoveContainer" containerID="d9668b84a47893d204ac8df2667b17a2a0c5a9359542ffb26bdad93328704645" Mar 10 00:32:52 crc kubenswrapper[4906]: I0310 00:32:52.168518 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-78rrb"] Mar 10 00:32:52 crc kubenswrapper[4906]: I0310 00:32:52.175687 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-78rrb"] Mar 10 00:32:52 crc kubenswrapper[4906]: I0310 00:32:52.224881 4906 scope.go:117] "RemoveContainer" containerID="7c926426bffe37090161021549ab441089c15b430c1d28a35318d466d32a95b6" Mar 10 00:32:52 crc kubenswrapper[4906]: I0310 00:32:52.241811 4906 scope.go:117] "RemoveContainer" containerID="9144ccd08691e039d433ae36aaabc8da71e38b456497fd8169de47d18f15027b" Mar 10 00:32:52 crc kubenswrapper[4906]: E0310 00:32:52.243830 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9144ccd08691e039d433ae36aaabc8da71e38b456497fd8169de47d18f15027b\": container with ID starting with 9144ccd08691e039d433ae36aaabc8da71e38b456497fd8169de47d18f15027b not found: ID does not exist" containerID="9144ccd08691e039d433ae36aaabc8da71e38b456497fd8169de47d18f15027b" Mar 10 00:32:52 crc kubenswrapper[4906]: I0310 00:32:52.243894 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9144ccd08691e039d433ae36aaabc8da71e38b456497fd8169de47d18f15027b"} err="failed to get container status \"9144ccd08691e039d433ae36aaabc8da71e38b456497fd8169de47d18f15027b\": rpc error: code = NotFound desc = could not find container \"9144ccd08691e039d433ae36aaabc8da71e38b456497fd8169de47d18f15027b\": container with ID starting with 9144ccd08691e039d433ae36aaabc8da71e38b456497fd8169de47d18f15027b not 
found: ID does not exist" Mar 10 00:32:52 crc kubenswrapper[4906]: I0310 00:32:52.243921 4906 scope.go:117] "RemoveContainer" containerID="d9668b84a47893d204ac8df2667b17a2a0c5a9359542ffb26bdad93328704645" Mar 10 00:32:52 crc kubenswrapper[4906]: E0310 00:32:52.244285 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9668b84a47893d204ac8df2667b17a2a0c5a9359542ffb26bdad93328704645\": container with ID starting with d9668b84a47893d204ac8df2667b17a2a0c5a9359542ffb26bdad93328704645 not found: ID does not exist" containerID="d9668b84a47893d204ac8df2667b17a2a0c5a9359542ffb26bdad93328704645" Mar 10 00:32:52 crc kubenswrapper[4906]: I0310 00:32:52.244314 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9668b84a47893d204ac8df2667b17a2a0c5a9359542ffb26bdad93328704645"} err="failed to get container status \"d9668b84a47893d204ac8df2667b17a2a0c5a9359542ffb26bdad93328704645\": rpc error: code = NotFound desc = could not find container \"d9668b84a47893d204ac8df2667b17a2a0c5a9359542ffb26bdad93328704645\": container with ID starting with d9668b84a47893d204ac8df2667b17a2a0c5a9359542ffb26bdad93328704645 not found: ID does not exist" Mar 10 00:32:52 crc kubenswrapper[4906]: I0310 00:32:52.244331 4906 scope.go:117] "RemoveContainer" containerID="7c926426bffe37090161021549ab441089c15b430c1d28a35318d466d32a95b6" Mar 10 00:32:52 crc kubenswrapper[4906]: E0310 00:32:52.244729 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c926426bffe37090161021549ab441089c15b430c1d28a35318d466d32a95b6\": container with ID starting with 7c926426bffe37090161021549ab441089c15b430c1d28a35318d466d32a95b6 not found: ID does not exist" containerID="7c926426bffe37090161021549ab441089c15b430c1d28a35318d466d32a95b6" Mar 10 00:32:52 crc kubenswrapper[4906]: I0310 00:32:52.244793 4906 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c926426bffe37090161021549ab441089c15b430c1d28a35318d466d32a95b6"} err="failed to get container status \"7c926426bffe37090161021549ab441089c15b430c1d28a35318d466d32a95b6\": rpc error: code = NotFound desc = could not find container \"7c926426bffe37090161021549ab441089c15b430c1d28a35318d466d32a95b6\": container with ID starting with 7c926426bffe37090161021549ab441089c15b430c1d28a35318d466d32a95b6 not found: ID does not exist" Mar 10 00:32:52 crc kubenswrapper[4906]: I0310 00:32:52.592173 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65844923-9bb8-44b1-8983-6bf8d7ff5d50" path="/var/lib/kubelet/pods/65844923-9bb8-44b1-8983-6bf8d7ff5d50/volumes" Mar 10 00:32:58 crc kubenswrapper[4906]: I0310 00:32:58.405381 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-jg6vm"] Mar 10 00:32:58 crc kubenswrapper[4906]: E0310 00:32:58.409196 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65844923-9bb8-44b1-8983-6bf8d7ff5d50" containerName="registry-server" Mar 10 00:32:58 crc kubenswrapper[4906]: I0310 00:32:58.409215 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="65844923-9bb8-44b1-8983-6bf8d7ff5d50" containerName="registry-server" Mar 10 00:32:58 crc kubenswrapper[4906]: E0310 00:32:58.409241 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65844923-9bb8-44b1-8983-6bf8d7ff5d50" containerName="extract-content" Mar 10 00:32:58 crc kubenswrapper[4906]: I0310 00:32:58.409249 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="65844923-9bb8-44b1-8983-6bf8d7ff5d50" containerName="extract-content" Mar 10 00:32:58 crc kubenswrapper[4906]: E0310 00:32:58.409265 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65844923-9bb8-44b1-8983-6bf8d7ff5d50" containerName="extract-utilities" Mar 10 00:32:58 crc kubenswrapper[4906]: 
I0310 00:32:58.409274 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="65844923-9bb8-44b1-8983-6bf8d7ff5d50" containerName="extract-utilities" Mar 10 00:32:58 crc kubenswrapper[4906]: I0310 00:32:58.409392 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="65844923-9bb8-44b1-8983-6bf8d7ff5d50" containerName="registry-server" Mar 10 00:32:58 crc kubenswrapper[4906]: I0310 00:32:58.409934 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-jg6vm" Mar 10 00:32:58 crc kubenswrapper[4906]: I0310 00:32:58.412512 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-users" Mar 10 00:32:58 crc kubenswrapper[4906]: I0310 00:32:58.413069 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-ca" Mar 10 00:32:58 crc kubenswrapper[4906]: I0310 00:32:58.413383 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-credentials" Mar 10 00:32:58 crc kubenswrapper[4906]: I0310 00:32:58.413455 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-ca" Mar 10 00:32:58 crc kubenswrapper[4906]: I0310 00:32:58.413595 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-credentials" Mar 10 00:32:58 crc kubenswrapper[4906]: I0310 00:32:58.413653 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-dockercfg-rh2fb" Mar 10 00:32:58 crc kubenswrapper[4906]: I0310 00:32:58.413690 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-interconnect-sasl-config" Mar 10 00:32:58 crc kubenswrapper[4906]: I0310 00:32:58.437246 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["service-telemetry/default-interconnect-68864d46cb-jg6vm"] Mar 10 00:32:58 crc kubenswrapper[4906]: I0310 00:32:58.496958 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/deec19d6-17bf-41d0-bbe9-5578dce6e7be-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-jg6vm\" (UID: \"deec19d6-17bf-41d0-bbe9-5578dce6e7be\") " pod="service-telemetry/default-interconnect-68864d46cb-jg6vm" Mar 10 00:32:58 crc kubenswrapper[4906]: I0310 00:32:58.497391 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/deec19d6-17bf-41d0-bbe9-5578dce6e7be-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-jg6vm\" (UID: \"deec19d6-17bf-41d0-bbe9-5578dce6e7be\") " pod="service-telemetry/default-interconnect-68864d46cb-jg6vm" Mar 10 00:32:58 crc kubenswrapper[4906]: I0310 00:32:58.497478 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/deec19d6-17bf-41d0-bbe9-5578dce6e7be-sasl-config\") pod \"default-interconnect-68864d46cb-jg6vm\" (UID: \"deec19d6-17bf-41d0-bbe9-5578dce6e7be\") " pod="service-telemetry/default-interconnect-68864d46cb-jg6vm" Mar 10 00:32:58 crc kubenswrapper[4906]: I0310 00:32:58.497576 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/deec19d6-17bf-41d0-bbe9-5578dce6e7be-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-jg6vm\" (UID: \"deec19d6-17bf-41d0-bbe9-5578dce6e7be\") " pod="service-telemetry/default-interconnect-68864d46cb-jg6vm" Mar 10 00:32:58 crc kubenswrapper[4906]: I0310 00:32:58.497700 4906 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/deec19d6-17bf-41d0-bbe9-5578dce6e7be-sasl-users\") pod \"default-interconnect-68864d46cb-jg6vm\" (UID: \"deec19d6-17bf-41d0-bbe9-5578dce6e7be\") " pod="service-telemetry/default-interconnect-68864d46cb-jg6vm" Mar 10 00:32:58 crc kubenswrapper[4906]: I0310 00:32:58.497799 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/deec19d6-17bf-41d0-bbe9-5578dce6e7be-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-jg6vm\" (UID: \"deec19d6-17bf-41d0-bbe9-5578dce6e7be\") " pod="service-telemetry/default-interconnect-68864d46cb-jg6vm" Mar 10 00:32:58 crc kubenswrapper[4906]: I0310 00:32:58.497969 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqfb7\" (UniqueName: \"kubernetes.io/projected/deec19d6-17bf-41d0-bbe9-5578dce6e7be-kube-api-access-wqfb7\") pod \"default-interconnect-68864d46cb-jg6vm\" (UID: \"deec19d6-17bf-41d0-bbe9-5578dce6e7be\") " pod="service-telemetry/default-interconnect-68864d46cb-jg6vm" Mar 10 00:32:58 crc kubenswrapper[4906]: I0310 00:32:58.576148 4906 scope.go:117] "RemoveContainer" containerID="3d03901045dd3898d4d4472388a38a1a6380b91b22032509551213e74b4671a3" Mar 10 00:32:58 crc kubenswrapper[4906]: E0310 00:32:58.576350 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bxtw4_openshift-machine-config-operator(72d61d35-0a64-45a5-8df3-9c429727deba)\"" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" Mar 10 00:32:58 crc 
kubenswrapper[4906]: I0310 00:32:58.598816 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/deec19d6-17bf-41d0-bbe9-5578dce6e7be-sasl-users\") pod \"default-interconnect-68864d46cb-jg6vm\" (UID: \"deec19d6-17bf-41d0-bbe9-5578dce6e7be\") " pod="service-telemetry/default-interconnect-68864d46cb-jg6vm" Mar 10 00:32:58 crc kubenswrapper[4906]: I0310 00:32:58.598877 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/deec19d6-17bf-41d0-bbe9-5578dce6e7be-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-jg6vm\" (UID: \"deec19d6-17bf-41d0-bbe9-5578dce6e7be\") " pod="service-telemetry/default-interconnect-68864d46cb-jg6vm" Mar 10 00:32:58 crc kubenswrapper[4906]: I0310 00:32:58.598897 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqfb7\" (UniqueName: \"kubernetes.io/projected/deec19d6-17bf-41d0-bbe9-5578dce6e7be-kube-api-access-wqfb7\") pod \"default-interconnect-68864d46cb-jg6vm\" (UID: \"deec19d6-17bf-41d0-bbe9-5578dce6e7be\") " pod="service-telemetry/default-interconnect-68864d46cb-jg6vm" Mar 10 00:32:58 crc kubenswrapper[4906]: I0310 00:32:58.598952 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/deec19d6-17bf-41d0-bbe9-5578dce6e7be-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-jg6vm\" (UID: \"deec19d6-17bf-41d0-bbe9-5578dce6e7be\") " pod="service-telemetry/default-interconnect-68864d46cb-jg6vm" Mar 10 00:32:58 crc kubenswrapper[4906]: I0310 00:32:58.598993 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: 
\"kubernetes.io/secret/deec19d6-17bf-41d0-bbe9-5578dce6e7be-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-jg6vm\" (UID: \"deec19d6-17bf-41d0-bbe9-5578dce6e7be\") " pod="service-telemetry/default-interconnect-68864d46cb-jg6vm" Mar 10 00:32:58 crc kubenswrapper[4906]: I0310 00:32:58.599008 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/deec19d6-17bf-41d0-bbe9-5578dce6e7be-sasl-config\") pod \"default-interconnect-68864d46cb-jg6vm\" (UID: \"deec19d6-17bf-41d0-bbe9-5578dce6e7be\") " pod="service-telemetry/default-interconnect-68864d46cb-jg6vm" Mar 10 00:32:58 crc kubenswrapper[4906]: I0310 00:32:58.599036 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/deec19d6-17bf-41d0-bbe9-5578dce6e7be-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-jg6vm\" (UID: \"deec19d6-17bf-41d0-bbe9-5578dce6e7be\") " pod="service-telemetry/default-interconnect-68864d46cb-jg6vm" Mar 10 00:32:58 crc kubenswrapper[4906]: I0310 00:32:58.600261 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/deec19d6-17bf-41d0-bbe9-5578dce6e7be-sasl-config\") pod \"default-interconnect-68864d46cb-jg6vm\" (UID: \"deec19d6-17bf-41d0-bbe9-5578dce6e7be\") " pod="service-telemetry/default-interconnect-68864d46cb-jg6vm" Mar 10 00:32:58 crc kubenswrapper[4906]: I0310 00:32:58.604446 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/deec19d6-17bf-41d0-bbe9-5578dce6e7be-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-jg6vm\" (UID: \"deec19d6-17bf-41d0-bbe9-5578dce6e7be\") " pod="service-telemetry/default-interconnect-68864d46cb-jg6vm" Mar 10 00:32:58 crc 
kubenswrapper[4906]: I0310 00:32:58.614139 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/deec19d6-17bf-41d0-bbe9-5578dce6e7be-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-jg6vm\" (UID: \"deec19d6-17bf-41d0-bbe9-5578dce6e7be\") " pod="service-telemetry/default-interconnect-68864d46cb-jg6vm" Mar 10 00:32:58 crc kubenswrapper[4906]: I0310 00:32:58.614745 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/deec19d6-17bf-41d0-bbe9-5578dce6e7be-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-jg6vm\" (UID: \"deec19d6-17bf-41d0-bbe9-5578dce6e7be\") " pod="service-telemetry/default-interconnect-68864d46cb-jg6vm" Mar 10 00:32:58 crc kubenswrapper[4906]: I0310 00:32:58.615232 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/deec19d6-17bf-41d0-bbe9-5578dce6e7be-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-jg6vm\" (UID: \"deec19d6-17bf-41d0-bbe9-5578dce6e7be\") " pod="service-telemetry/default-interconnect-68864d46cb-jg6vm" Mar 10 00:32:58 crc kubenswrapper[4906]: I0310 00:32:58.615667 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/deec19d6-17bf-41d0-bbe9-5578dce6e7be-sasl-users\") pod \"default-interconnect-68864d46cb-jg6vm\" (UID: \"deec19d6-17bf-41d0-bbe9-5578dce6e7be\") " pod="service-telemetry/default-interconnect-68864d46cb-jg6vm" Mar 10 00:32:58 crc kubenswrapper[4906]: I0310 00:32:58.617843 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqfb7\" (UniqueName: 
\"kubernetes.io/projected/deec19d6-17bf-41d0-bbe9-5578dce6e7be-kube-api-access-wqfb7\") pod \"default-interconnect-68864d46cb-jg6vm\" (UID: \"deec19d6-17bf-41d0-bbe9-5578dce6e7be\") " pod="service-telemetry/default-interconnect-68864d46cb-jg6vm" Mar 10 00:32:58 crc kubenswrapper[4906]: I0310 00:32:58.758748 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-jg6vm" Mar 10 00:32:59 crc kubenswrapper[4906]: I0310 00:32:59.752591 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-jg6vm"] Mar 10 00:33:00 crc kubenswrapper[4906]: I0310 00:33:00.189984 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-jg6vm" event={"ID":"deec19d6-17bf-41d0-bbe9-5578dce6e7be","Type":"ContainerStarted","Data":"023856eec11a4bff7a2493ccf96f4af835f9e2a6ba25c265fe47ef5c03709265"} Mar 10 00:33:06 crc kubenswrapper[4906]: I0310 00:33:06.232222 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-jg6vm" event={"ID":"deec19d6-17bf-41d0-bbe9-5578dce6e7be","Type":"ContainerStarted","Data":"d1081e6a240b63f416b332e706337c910c94a358dd3f5ffea8c02e3259b90754"} Mar 10 00:33:06 crc kubenswrapper[4906]: I0310 00:33:06.268268 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-interconnect-68864d46cb-jg6vm" podStartSLOduration=2.022573624 podStartE2EDuration="8.268245997s" podCreationTimestamp="2026-03-10 00:32:58 +0000 UTC" firstStartedPulling="2026-03-10 00:32:59.764161325 +0000 UTC m=+1605.912056437" lastFinishedPulling="2026-03-10 00:33:06.009833698 +0000 UTC m=+1612.157728810" observedRunningTime="2026-03-10 00:33:06.261159078 +0000 UTC m=+1612.409054190" watchObservedRunningTime="2026-03-10 00:33:06.268245997 +0000 UTC m=+1612.416141129" Mar 10 00:33:08 crc kubenswrapper[4906]: I0310 00:33:08.136252 
4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-default-0"] Mar 10 00:33:08 crc kubenswrapper[4906]: I0310 00:33:08.137905 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-default-0" Mar 10 00:33:08 crc kubenswrapper[4906]: I0310 00:33:08.140350 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-session-secret" Mar 10 00:33:08 crc kubenswrapper[4906]: I0310 00:33:08.140689 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default-tls-assets-0" Mar 10 00:33:08 crc kubenswrapper[4906]: I0310 00:33:08.140818 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-stf-dockercfg-k9qlz" Mar 10 00:33:08 crc kubenswrapper[4906]: I0310 00:33:08.141033 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-0" Mar 10 00:33:08 crc kubenswrapper[4906]: I0310 00:33:08.141156 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default-web-config" Mar 10 00:33:08 crc kubenswrapper[4906]: I0310 00:33:08.141209 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"serving-certs-ca-bundle" Mar 10 00:33:08 crc kubenswrapper[4906]: I0310 00:33:08.141405 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-1" Mar 10 00:33:08 crc kubenswrapper[4906]: I0310 00:33:08.141433 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default" Mar 10 00:33:08 crc kubenswrapper[4906]: I0310 00:33:08.141438 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-prometheus-proxy-tls" Mar 10 00:33:08 crc kubenswrapper[4906]: I0310 00:33:08.141873 4906 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-2" Mar 10 00:33:08 crc kubenswrapper[4906]: I0310 00:33:08.155156 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-default-0"] Mar 10 00:33:08 crc kubenswrapper[4906]: I0310 00:33:08.263073 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7a0470ff-1325-4c14-8b7d-93f6ecad34e6-config\") pod \"prometheus-default-0\" (UID: \"7a0470ff-1325-4c14-8b7d-93f6ecad34e6\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:08 crc kubenswrapper[4906]: I0310 00:33:08.263148 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/7a0470ff-1325-4c14-8b7d-93f6ecad34e6-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"7a0470ff-1325-4c14-8b7d-93f6ecad34e6\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:08 crc kubenswrapper[4906]: I0310 00:33:08.263207 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghkbm\" (UniqueName: \"kubernetes.io/projected/7a0470ff-1325-4c14-8b7d-93f6ecad34e6-kube-api-access-ghkbm\") pod \"prometheus-default-0\" (UID: \"7a0470ff-1325-4c14-8b7d-93f6ecad34e6\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:08 crc kubenswrapper[4906]: I0310 00:33:08.263257 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7a0470ff-1325-4c14-8b7d-93f6ecad34e6-web-config\") pod \"prometheus-default-0\" (UID: \"7a0470ff-1325-4c14-8b7d-93f6ecad34e6\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:08 crc kubenswrapper[4906]: I0310 00:33:08.263303 4906 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a0470ff-1325-4c14-8b7d-93f6ecad34e6-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"7a0470ff-1325-4c14-8b7d-93f6ecad34e6\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:08 crc kubenswrapper[4906]: I0310 00:33:08.263352 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7a0470ff-1325-4c14-8b7d-93f6ecad34e6-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"7a0470ff-1325-4c14-8b7d-93f6ecad34e6\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:08 crc kubenswrapper[4906]: I0310 00:33:08.263469 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/7a0470ff-1325-4c14-8b7d-93f6ecad34e6-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"7a0470ff-1325-4c14-8b7d-93f6ecad34e6\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:08 crc kubenswrapper[4906]: I0310 00:33:08.263539 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/7a0470ff-1325-4c14-8b7d-93f6ecad34e6-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"7a0470ff-1325-4c14-8b7d-93f6ecad34e6\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:08 crc kubenswrapper[4906]: I0310 00:33:08.263583 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a1d9a277-e547-4383-8861-4a9971ea6f79\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a1d9a277-e547-4383-8861-4a9971ea6f79\") pod \"prometheus-default-0\" (UID: 
\"7a0470ff-1325-4c14-8b7d-93f6ecad34e6\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:08 crc kubenswrapper[4906]: I0310 00:33:08.263704 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/7a0470ff-1325-4c14-8b7d-93f6ecad34e6-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"7a0470ff-1325-4c14-8b7d-93f6ecad34e6\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:08 crc kubenswrapper[4906]: I0310 00:33:08.263868 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7a0470ff-1325-4c14-8b7d-93f6ecad34e6-tls-assets\") pod \"prometheus-default-0\" (UID: \"7a0470ff-1325-4c14-8b7d-93f6ecad34e6\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:08 crc kubenswrapper[4906]: I0310 00:33:08.263973 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7a0470ff-1325-4c14-8b7d-93f6ecad34e6-config-out\") pod \"prometheus-default-0\" (UID: \"7a0470ff-1325-4c14-8b7d-93f6ecad34e6\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:08 crc kubenswrapper[4906]: I0310 00:33:08.365041 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a1d9a277-e547-4383-8861-4a9971ea6f79\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a1d9a277-e547-4383-8861-4a9971ea6f79\") pod \"prometheus-default-0\" (UID: \"7a0470ff-1325-4c14-8b7d-93f6ecad34e6\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:08 crc kubenswrapper[4906]: I0310 00:33:08.365112 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-session-secret\" (UniqueName: 
\"kubernetes.io/secret/7a0470ff-1325-4c14-8b7d-93f6ecad34e6-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"7a0470ff-1325-4c14-8b7d-93f6ecad34e6\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:08 crc kubenswrapper[4906]: I0310 00:33:08.365141 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7a0470ff-1325-4c14-8b7d-93f6ecad34e6-tls-assets\") pod \"prometheus-default-0\" (UID: \"7a0470ff-1325-4c14-8b7d-93f6ecad34e6\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:08 crc kubenswrapper[4906]: I0310 00:33:08.365158 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7a0470ff-1325-4c14-8b7d-93f6ecad34e6-config-out\") pod \"prometheus-default-0\" (UID: \"7a0470ff-1325-4c14-8b7d-93f6ecad34e6\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:08 crc kubenswrapper[4906]: I0310 00:33:08.365175 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7a0470ff-1325-4c14-8b7d-93f6ecad34e6-config\") pod \"prometheus-default-0\" (UID: \"7a0470ff-1325-4c14-8b7d-93f6ecad34e6\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:08 crc kubenswrapper[4906]: I0310 00:33:08.365195 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/7a0470ff-1325-4c14-8b7d-93f6ecad34e6-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"7a0470ff-1325-4c14-8b7d-93f6ecad34e6\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:08 crc kubenswrapper[4906]: I0310 00:33:08.365220 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghkbm\" (UniqueName: 
\"kubernetes.io/projected/7a0470ff-1325-4c14-8b7d-93f6ecad34e6-kube-api-access-ghkbm\") pod \"prometheus-default-0\" (UID: \"7a0470ff-1325-4c14-8b7d-93f6ecad34e6\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:08 crc kubenswrapper[4906]: I0310 00:33:08.365251 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7a0470ff-1325-4c14-8b7d-93f6ecad34e6-web-config\") pod \"prometheus-default-0\" (UID: \"7a0470ff-1325-4c14-8b7d-93f6ecad34e6\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:08 crc kubenswrapper[4906]: I0310 00:33:08.365269 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7a0470ff-1325-4c14-8b7d-93f6ecad34e6-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"7a0470ff-1325-4c14-8b7d-93f6ecad34e6\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:08 crc kubenswrapper[4906]: I0310 00:33:08.365285 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a0470ff-1325-4c14-8b7d-93f6ecad34e6-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"7a0470ff-1325-4c14-8b7d-93f6ecad34e6\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:08 crc kubenswrapper[4906]: I0310 00:33:08.365303 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/7a0470ff-1325-4c14-8b7d-93f6ecad34e6-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"7a0470ff-1325-4c14-8b7d-93f6ecad34e6\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:08 crc kubenswrapper[4906]: I0310 00:33:08.365326 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/7a0470ff-1325-4c14-8b7d-93f6ecad34e6-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"7a0470ff-1325-4c14-8b7d-93f6ecad34e6\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:08 crc kubenswrapper[4906]: E0310 00:33:08.365448 4906 secret.go:188] Couldn't get secret service-telemetry/default-prometheus-proxy-tls: secret "default-prometheus-proxy-tls" not found Mar 10 00:33:08 crc kubenswrapper[4906]: E0310 00:33:08.365494 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a0470ff-1325-4c14-8b7d-93f6ecad34e6-secret-default-prometheus-proxy-tls podName:7a0470ff-1325-4c14-8b7d-93f6ecad34e6 nodeName:}" failed. No retries permitted until 2026-03-10 00:33:08.865478676 +0000 UTC m=+1615.013373788 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-default-prometheus-proxy-tls" (UniqueName: "kubernetes.io/secret/7a0470ff-1325-4c14-8b7d-93f6ecad34e6-secret-default-prometheus-proxy-tls") pod "prometheus-default-0" (UID: "7a0470ff-1325-4c14-8b7d-93f6ecad34e6") : secret "default-prometheus-proxy-tls" not found Mar 10 00:33:08 crc kubenswrapper[4906]: I0310 00:33:08.366376 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7a0470ff-1325-4c14-8b7d-93f6ecad34e6-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"7a0470ff-1325-4c14-8b7d-93f6ecad34e6\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:08 crc kubenswrapper[4906]: I0310 00:33:08.366614 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a0470ff-1325-4c14-8b7d-93f6ecad34e6-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"7a0470ff-1325-4c14-8b7d-93f6ecad34e6\") " 
pod="service-telemetry/prometheus-default-0" Mar 10 00:33:08 crc kubenswrapper[4906]: I0310 00:33:08.367283 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/7a0470ff-1325-4c14-8b7d-93f6ecad34e6-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"7a0470ff-1325-4c14-8b7d-93f6ecad34e6\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:08 crc kubenswrapper[4906]: I0310 00:33:08.367333 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/7a0470ff-1325-4c14-8b7d-93f6ecad34e6-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"7a0470ff-1325-4c14-8b7d-93f6ecad34e6\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:08 crc kubenswrapper[4906]: I0310 00:33:08.371637 4906 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 10 00:33:08 crc kubenswrapper[4906]: I0310 00:33:08.371814 4906 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a1d9a277-e547-4383-8861-4a9971ea6f79\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a1d9a277-e547-4383-8861-4a9971ea6f79\") pod \"prometheus-default-0\" (UID: \"7a0470ff-1325-4c14-8b7d-93f6ecad34e6\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/47587dd86bb2e1f2ac3f7de7089af7a3c84ba1cf55b70544214fb507864cfab7/globalmount\"" pod="service-telemetry/prometheus-default-0" Mar 10 00:33:08 crc kubenswrapper[4906]: I0310 00:33:08.373772 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7a0470ff-1325-4c14-8b7d-93f6ecad34e6-tls-assets\") pod \"prometheus-default-0\" (UID: \"7a0470ff-1325-4c14-8b7d-93f6ecad34e6\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:08 crc kubenswrapper[4906]: I0310 00:33:08.373801 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7a0470ff-1325-4c14-8b7d-93f6ecad34e6-config\") pod \"prometheus-default-0\" (UID: \"7a0470ff-1325-4c14-8b7d-93f6ecad34e6\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:08 crc kubenswrapper[4906]: I0310 00:33:08.374104 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/7a0470ff-1325-4c14-8b7d-93f6ecad34e6-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"7a0470ff-1325-4c14-8b7d-93f6ecad34e6\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:08 crc kubenswrapper[4906]: I0310 00:33:08.386057 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7a0470ff-1325-4c14-8b7d-93f6ecad34e6-config-out\") pod \"prometheus-default-0\" (UID: 
\"7a0470ff-1325-4c14-8b7d-93f6ecad34e6\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:08 crc kubenswrapper[4906]: I0310 00:33:08.386951 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7a0470ff-1325-4c14-8b7d-93f6ecad34e6-web-config\") pod \"prometheus-default-0\" (UID: \"7a0470ff-1325-4c14-8b7d-93f6ecad34e6\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:08 crc kubenswrapper[4906]: I0310 00:33:08.414162 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghkbm\" (UniqueName: \"kubernetes.io/projected/7a0470ff-1325-4c14-8b7d-93f6ecad34e6-kube-api-access-ghkbm\") pod \"prometheus-default-0\" (UID: \"7a0470ff-1325-4c14-8b7d-93f6ecad34e6\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:08 crc kubenswrapper[4906]: I0310 00:33:08.432712 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a1d9a277-e547-4383-8861-4a9971ea6f79\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a1d9a277-e547-4383-8861-4a9971ea6f79\") pod \"prometheus-default-0\" (UID: \"7a0470ff-1325-4c14-8b7d-93f6ecad34e6\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:08 crc kubenswrapper[4906]: I0310 00:33:08.872630 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/7a0470ff-1325-4c14-8b7d-93f6ecad34e6-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"7a0470ff-1325-4c14-8b7d-93f6ecad34e6\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:08 crc kubenswrapper[4906]: E0310 00:33:08.872853 4906 secret.go:188] Couldn't get secret service-telemetry/default-prometheus-proxy-tls: secret "default-prometheus-proxy-tls" not found Mar 10 00:33:08 crc kubenswrapper[4906]: E0310 00:33:08.872967 4906 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/7a0470ff-1325-4c14-8b7d-93f6ecad34e6-secret-default-prometheus-proxy-tls podName:7a0470ff-1325-4c14-8b7d-93f6ecad34e6 nodeName:}" failed. No retries permitted until 2026-03-10 00:33:09.87293747 +0000 UTC m=+1616.020832622 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "secret-default-prometheus-proxy-tls" (UniqueName: "kubernetes.io/secret/7a0470ff-1325-4c14-8b7d-93f6ecad34e6-secret-default-prometheus-proxy-tls") pod "prometheus-default-0" (UID: "7a0470ff-1325-4c14-8b7d-93f6ecad34e6") : secret "default-prometheus-proxy-tls" not found Mar 10 00:33:09 crc kubenswrapper[4906]: I0310 00:33:09.576824 4906 scope.go:117] "RemoveContainer" containerID="3d03901045dd3898d4d4472388a38a1a6380b91b22032509551213e74b4671a3" Mar 10 00:33:09 crc kubenswrapper[4906]: E0310 00:33:09.577295 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bxtw4_openshift-machine-config-operator(72d61d35-0a64-45a5-8df3-9c429727deba)\"" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" Mar 10 00:33:09 crc kubenswrapper[4906]: I0310 00:33:09.888828 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/7a0470ff-1325-4c14-8b7d-93f6ecad34e6-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"7a0470ff-1325-4c14-8b7d-93f6ecad34e6\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:09 crc kubenswrapper[4906]: I0310 00:33:09.892568 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/7a0470ff-1325-4c14-8b7d-93f6ecad34e6-secret-default-prometheus-proxy-tls\") pod 
\"prometheus-default-0\" (UID: \"7a0470ff-1325-4c14-8b7d-93f6ecad34e6\") " pod="service-telemetry/prometheus-default-0" Mar 10 00:33:09 crc kubenswrapper[4906]: I0310 00:33:09.964535 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-default-0" Mar 10 00:33:10 crc kubenswrapper[4906]: W0310 00:33:10.437352 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a0470ff_1325_4c14_8b7d_93f6ecad34e6.slice/crio-dd0a651f3ddd4bc505488d0871e29e09f4265ed1fe234544551ab5d43908ebb8 WatchSource:0}: Error finding container dd0a651f3ddd4bc505488d0871e29e09f4265ed1fe234544551ab5d43908ebb8: Status 404 returned error can't find the container with id dd0a651f3ddd4bc505488d0871e29e09f4265ed1fe234544551ab5d43908ebb8 Mar 10 00:33:10 crc kubenswrapper[4906]: I0310 00:33:10.440035 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-default-0"] Mar 10 00:33:10 crc kubenswrapper[4906]: I0310 00:33:10.440368 4906 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 00:33:11 crc kubenswrapper[4906]: I0310 00:33:11.274717 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"7a0470ff-1325-4c14-8b7d-93f6ecad34e6","Type":"ContainerStarted","Data":"dd0a651f3ddd4bc505488d0871e29e09f4265ed1fe234544551ab5d43908ebb8"} Mar 10 00:33:15 crc kubenswrapper[4906]: I0310 00:33:15.309241 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"7a0470ff-1325-4c14-8b7d-93f6ecad34e6","Type":"ContainerStarted","Data":"4cfe64d761c9797fe65eb81d4ecff82bc5e47cad2dcab5354e91c2568cf3058b"} Mar 10 00:33:17 crc kubenswrapper[4906]: I0310 00:33:17.722734 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-snmp-webhook-6856cfb745-zsbkd"] Mar 10 00:33:17 crc 
kubenswrapper[4906]: I0310 00:33:17.724529 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-snmp-webhook-6856cfb745-zsbkd" Mar 10 00:33:17 crc kubenswrapper[4906]: I0310 00:33:17.750048 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-snmp-webhook-6856cfb745-zsbkd"] Mar 10 00:33:17 crc kubenswrapper[4906]: I0310 00:33:17.912441 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m86l6\" (UniqueName: \"kubernetes.io/projected/b7a5d692-659c-4912-b290-4a4010f370b6-kube-api-access-m86l6\") pod \"default-snmp-webhook-6856cfb745-zsbkd\" (UID: \"b7a5d692-659c-4912-b290-4a4010f370b6\") " pod="service-telemetry/default-snmp-webhook-6856cfb745-zsbkd" Mar 10 00:33:18 crc kubenswrapper[4906]: I0310 00:33:18.013670 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m86l6\" (UniqueName: \"kubernetes.io/projected/b7a5d692-659c-4912-b290-4a4010f370b6-kube-api-access-m86l6\") pod \"default-snmp-webhook-6856cfb745-zsbkd\" (UID: \"b7a5d692-659c-4912-b290-4a4010f370b6\") " pod="service-telemetry/default-snmp-webhook-6856cfb745-zsbkd" Mar 10 00:33:18 crc kubenswrapper[4906]: I0310 00:33:18.040230 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m86l6\" (UniqueName: \"kubernetes.io/projected/b7a5d692-659c-4912-b290-4a4010f370b6-kube-api-access-m86l6\") pod \"default-snmp-webhook-6856cfb745-zsbkd\" (UID: \"b7a5d692-659c-4912-b290-4a4010f370b6\") " pod="service-telemetry/default-snmp-webhook-6856cfb745-zsbkd" Mar 10 00:33:18 crc kubenswrapper[4906]: I0310 00:33:18.054981 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-snmp-webhook-6856cfb745-zsbkd" Mar 10 00:33:18 crc kubenswrapper[4906]: I0310 00:33:18.470493 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-snmp-webhook-6856cfb745-zsbkd"] Mar 10 00:33:18 crc kubenswrapper[4906]: W0310 00:33:18.482338 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7a5d692_659c_4912_b290_4a4010f370b6.slice/crio-af1ee20772fcbd1eb99303cdc83dae8882173b6f3174ef3561663f6e90664150 WatchSource:0}: Error finding container af1ee20772fcbd1eb99303cdc83dae8882173b6f3174ef3561663f6e90664150: Status 404 returned error can't find the container with id af1ee20772fcbd1eb99303cdc83dae8882173b6f3174ef3561663f6e90664150 Mar 10 00:33:19 crc kubenswrapper[4906]: I0310 00:33:19.341768 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-snmp-webhook-6856cfb745-zsbkd" event={"ID":"b7a5d692-659c-4912-b290-4a4010f370b6","Type":"ContainerStarted","Data":"af1ee20772fcbd1eb99303cdc83dae8882173b6f3174ef3561663f6e90664150"} Mar 10 00:33:20 crc kubenswrapper[4906]: I0310 00:33:20.576251 4906 scope.go:117] "RemoveContainer" containerID="3d03901045dd3898d4d4472388a38a1a6380b91b22032509551213e74b4671a3" Mar 10 00:33:20 crc kubenswrapper[4906]: E0310 00:33:20.576796 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bxtw4_openshift-machine-config-operator(72d61d35-0a64-45a5-8df3-9c429727deba)\"" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" Mar 10 00:33:21 crc kubenswrapper[4906]: I0310 00:33:21.381147 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/alertmanager-default-0"] Mar 10 00:33:21 crc 
kubenswrapper[4906]: I0310 00:33:21.384398 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/alertmanager-default-0" Mar 10 00:33:21 crc kubenswrapper[4906]: I0310 00:33:21.386714 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-generated" Mar 10 00:33:21 crc kubenswrapper[4906]: I0310 00:33:21.386859 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-cluster-tls-config" Mar 10 00:33:21 crc kubenswrapper[4906]: I0310 00:33:21.387005 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-tls-assets-0" Mar 10 00:33:21 crc kubenswrapper[4906]: I0310 00:33:21.387370 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-stf-dockercfg-z7wwr" Mar 10 00:33:21 crc kubenswrapper[4906]: I0310 00:33:21.388854 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-alertmanager-proxy-tls" Mar 10 00:33:21 crc kubenswrapper[4906]: I0310 00:33:21.389031 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-web-config" Mar 10 00:33:21 crc kubenswrapper[4906]: I0310 00:33:21.415872 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/alertmanager-default-0"] Mar 10 00:33:21 crc kubenswrapper[4906]: I0310 00:33:21.570545 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f531aabb-4f0a-49a6-95ed-f5e3c04ca148-config-out\") pod \"alertmanager-default-0\" (UID: \"f531aabb-4f0a-49a6-95ed-f5e3c04ca148\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:33:21 crc kubenswrapper[4906]: I0310 00:33:21.570615 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f531aabb-4f0a-49a6-95ed-f5e3c04ca148-tls-assets\") pod \"alertmanager-default-0\" (UID: \"f531aabb-4f0a-49a6-95ed-f5e3c04ca148\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:33:21 crc kubenswrapper[4906]: I0310 00:33:21.570824 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e99558c6-823a-4c88-99a6-9fcda01b2034\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e99558c6-823a-4c88-99a6-9fcda01b2034\") pod \"alertmanager-default-0\" (UID: \"f531aabb-4f0a-49a6-95ed-f5e3c04ca148\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:33:21 crc kubenswrapper[4906]: I0310 00:33:21.570887 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f531aabb-4f0a-49a6-95ed-f5e3c04ca148-config-volume\") pod \"alertmanager-default-0\" (UID: \"f531aabb-4f0a-49a6-95ed-f5e3c04ca148\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:33:21 crc kubenswrapper[4906]: I0310 00:33:21.570914 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/f531aabb-4f0a-49a6-95ed-f5e3c04ca148-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"f531aabb-4f0a-49a6-95ed-f5e3c04ca148\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:33:21 crc kubenswrapper[4906]: I0310 00:33:21.570961 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f531aabb-4f0a-49a6-95ed-f5e3c04ca148-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"f531aabb-4f0a-49a6-95ed-f5e3c04ca148\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:33:21 crc kubenswrapper[4906]: I0310 
00:33:21.572035 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f531aabb-4f0a-49a6-95ed-f5e3c04ca148-web-config\") pod \"alertmanager-default-0\" (UID: \"f531aabb-4f0a-49a6-95ed-f5e3c04ca148\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:33:21 crc kubenswrapper[4906]: I0310 00:33:21.572095 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5hzw\" (UniqueName: \"kubernetes.io/projected/f531aabb-4f0a-49a6-95ed-f5e3c04ca148-kube-api-access-x5hzw\") pod \"alertmanager-default-0\" (UID: \"f531aabb-4f0a-49a6-95ed-f5e3c04ca148\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:33:21 crc kubenswrapper[4906]: I0310 00:33:21.572124 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/f531aabb-4f0a-49a6-95ed-f5e3c04ca148-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"f531aabb-4f0a-49a6-95ed-f5e3c04ca148\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:33:21 crc kubenswrapper[4906]: I0310 00:33:21.673412 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e99558c6-823a-4c88-99a6-9fcda01b2034\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e99558c6-823a-4c88-99a6-9fcda01b2034\") pod \"alertmanager-default-0\" (UID: \"f531aabb-4f0a-49a6-95ed-f5e3c04ca148\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:33:21 crc kubenswrapper[4906]: I0310 00:33:21.673510 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f531aabb-4f0a-49a6-95ed-f5e3c04ca148-config-volume\") pod \"alertmanager-default-0\" (UID: \"f531aabb-4f0a-49a6-95ed-f5e3c04ca148\") " 
pod="service-telemetry/alertmanager-default-0" Mar 10 00:33:21 crc kubenswrapper[4906]: I0310 00:33:21.673542 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/f531aabb-4f0a-49a6-95ed-f5e3c04ca148-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"f531aabb-4f0a-49a6-95ed-f5e3c04ca148\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:33:21 crc kubenswrapper[4906]: I0310 00:33:21.673607 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f531aabb-4f0a-49a6-95ed-f5e3c04ca148-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"f531aabb-4f0a-49a6-95ed-f5e3c04ca148\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:33:21 crc kubenswrapper[4906]: I0310 00:33:21.673673 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f531aabb-4f0a-49a6-95ed-f5e3c04ca148-web-config\") pod \"alertmanager-default-0\" (UID: \"f531aabb-4f0a-49a6-95ed-f5e3c04ca148\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:33:21 crc kubenswrapper[4906]: I0310 00:33:21.673701 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/f531aabb-4f0a-49a6-95ed-f5e3c04ca148-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"f531aabb-4f0a-49a6-95ed-f5e3c04ca148\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:33:21 crc kubenswrapper[4906]: I0310 00:33:21.673761 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5hzw\" (UniqueName: \"kubernetes.io/projected/f531aabb-4f0a-49a6-95ed-f5e3c04ca148-kube-api-access-x5hzw\") pod \"alertmanager-default-0\" (UID: 
\"f531aabb-4f0a-49a6-95ed-f5e3c04ca148\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:33:21 crc kubenswrapper[4906]: I0310 00:33:21.673883 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f531aabb-4f0a-49a6-95ed-f5e3c04ca148-config-out\") pod \"alertmanager-default-0\" (UID: \"f531aabb-4f0a-49a6-95ed-f5e3c04ca148\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:33:21 crc kubenswrapper[4906]: I0310 00:33:21.673983 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f531aabb-4f0a-49a6-95ed-f5e3c04ca148-tls-assets\") pod \"alertmanager-default-0\" (UID: \"f531aabb-4f0a-49a6-95ed-f5e3c04ca148\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:33:21 crc kubenswrapper[4906]: E0310 00:33:21.673891 4906 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found Mar 10 00:33:21 crc kubenswrapper[4906]: E0310 00:33:21.674210 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f531aabb-4f0a-49a6-95ed-f5e3c04ca148-secret-default-alertmanager-proxy-tls podName:f531aabb-4f0a-49a6-95ed-f5e3c04ca148 nodeName:}" failed. No retries permitted until 2026-03-10 00:33:22.174185152 +0000 UTC m=+1628.322080324 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/f531aabb-4f0a-49a6-95ed-f5e3c04ca148-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "f531aabb-4f0a-49a6-95ed-f5e3c04ca148") : secret "default-alertmanager-proxy-tls" not found Mar 10 00:33:21 crc kubenswrapper[4906]: I0310 00:33:21.681827 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f531aabb-4f0a-49a6-95ed-f5e3c04ca148-config-out\") pod \"alertmanager-default-0\" (UID: \"f531aabb-4f0a-49a6-95ed-f5e3c04ca148\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:33:21 crc kubenswrapper[4906]: I0310 00:33:21.682568 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f531aabb-4f0a-49a6-95ed-f5e3c04ca148-web-config\") pod \"alertmanager-default-0\" (UID: \"f531aabb-4f0a-49a6-95ed-f5e3c04ca148\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:33:21 crc kubenswrapper[4906]: I0310 00:33:21.682974 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f531aabb-4f0a-49a6-95ed-f5e3c04ca148-tls-assets\") pod \"alertmanager-default-0\" (UID: \"f531aabb-4f0a-49a6-95ed-f5e3c04ca148\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:33:21 crc kubenswrapper[4906]: I0310 00:33:21.684669 4906 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 10 00:33:21 crc kubenswrapper[4906]: I0310 00:33:21.684708 4906 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e99558c6-823a-4c88-99a6-9fcda01b2034\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e99558c6-823a-4c88-99a6-9fcda01b2034\") pod \"alertmanager-default-0\" (UID: \"f531aabb-4f0a-49a6-95ed-f5e3c04ca148\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9c0ace09c7da553a7a23e66e6c2a706d7aaf2fb66576a615255f5740d18adba6/globalmount\"" pod="service-telemetry/alertmanager-default-0" Mar 10 00:33:21 crc kubenswrapper[4906]: I0310 00:33:21.693313 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5hzw\" (UniqueName: \"kubernetes.io/projected/f531aabb-4f0a-49a6-95ed-f5e3c04ca148-kube-api-access-x5hzw\") pod \"alertmanager-default-0\" (UID: \"f531aabb-4f0a-49a6-95ed-f5e3c04ca148\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:33:21 crc kubenswrapper[4906]: I0310 00:33:21.695901 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f531aabb-4f0a-49a6-95ed-f5e3c04ca148-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"f531aabb-4f0a-49a6-95ed-f5e3c04ca148\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:33:21 crc kubenswrapper[4906]: I0310 00:33:21.696440 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/f531aabb-4f0a-49a6-95ed-f5e3c04ca148-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"f531aabb-4f0a-49a6-95ed-f5e3c04ca148\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:33:21 crc kubenswrapper[4906]: I0310 00:33:21.697362 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/secret/f531aabb-4f0a-49a6-95ed-f5e3c04ca148-config-volume\") pod \"alertmanager-default-0\" (UID: \"f531aabb-4f0a-49a6-95ed-f5e3c04ca148\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:33:21 crc kubenswrapper[4906]: I0310 00:33:21.717151 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e99558c6-823a-4c88-99a6-9fcda01b2034\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e99558c6-823a-4c88-99a6-9fcda01b2034\") pod \"alertmanager-default-0\" (UID: \"f531aabb-4f0a-49a6-95ed-f5e3c04ca148\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:33:22 crc kubenswrapper[4906]: I0310 00:33:22.201428 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/f531aabb-4f0a-49a6-95ed-f5e3c04ca148-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"f531aabb-4f0a-49a6-95ed-f5e3c04ca148\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:33:22 crc kubenswrapper[4906]: E0310 00:33:22.201624 4906 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found Mar 10 00:33:22 crc kubenswrapper[4906]: E0310 00:33:22.201695 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f531aabb-4f0a-49a6-95ed-f5e3c04ca148-secret-default-alertmanager-proxy-tls podName:f531aabb-4f0a-49a6-95ed-f5e3c04ca148 nodeName:}" failed. No retries permitted until 2026-03-10 00:33:23.201677338 +0000 UTC m=+1629.349572450 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/f531aabb-4f0a-49a6-95ed-f5e3c04ca148-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "f531aabb-4f0a-49a6-95ed-f5e3c04ca148") : secret "default-alertmanager-proxy-tls" not found Mar 10 00:33:23 crc kubenswrapper[4906]: I0310 00:33:23.214837 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/f531aabb-4f0a-49a6-95ed-f5e3c04ca148-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"f531aabb-4f0a-49a6-95ed-f5e3c04ca148\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:33:23 crc kubenswrapper[4906]: E0310 00:33:23.215133 4906 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found Mar 10 00:33:23 crc kubenswrapper[4906]: E0310 00:33:23.215301 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f531aabb-4f0a-49a6-95ed-f5e3c04ca148-secret-default-alertmanager-proxy-tls podName:f531aabb-4f0a-49a6-95ed-f5e3c04ca148 nodeName:}" failed. No retries permitted until 2026-03-10 00:33:25.215262169 +0000 UTC m=+1631.363157321 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/f531aabb-4f0a-49a6-95ed-f5e3c04ca148-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "f531aabb-4f0a-49a6-95ed-f5e3c04ca148") : secret "default-alertmanager-proxy-tls" not found Mar 10 00:33:25 crc kubenswrapper[4906]: I0310 00:33:25.251652 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/f531aabb-4f0a-49a6-95ed-f5e3c04ca148-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"f531aabb-4f0a-49a6-95ed-f5e3c04ca148\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:33:25 crc kubenswrapper[4906]: I0310 00:33:25.260542 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/f531aabb-4f0a-49a6-95ed-f5e3c04ca148-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"f531aabb-4f0a-49a6-95ed-f5e3c04ca148\") " pod="service-telemetry/alertmanager-default-0" Mar 10 00:33:25 crc kubenswrapper[4906]: I0310 00:33:25.322126 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/alertmanager-default-0" Mar 10 00:33:25 crc kubenswrapper[4906]: I0310 00:33:25.385385 4906 generic.go:334] "Generic (PLEG): container finished" podID="7a0470ff-1325-4c14-8b7d-93f6ecad34e6" containerID="4cfe64d761c9797fe65eb81d4ecff82bc5e47cad2dcab5354e91c2568cf3058b" exitCode=0 Mar 10 00:33:25 crc kubenswrapper[4906]: I0310 00:33:25.385426 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"7a0470ff-1325-4c14-8b7d-93f6ecad34e6","Type":"ContainerDied","Data":"4cfe64d761c9797fe65eb81d4ecff82bc5e47cad2dcab5354e91c2568cf3058b"} Mar 10 00:33:27 crc kubenswrapper[4906]: I0310 00:33:27.114501 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/alertmanager-default-0"] Mar 10 00:33:27 crc kubenswrapper[4906]: W0310 00:33:27.262739 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf531aabb_4f0a_49a6_95ed_f5e3c04ca148.slice/crio-453282231f47d96bfccebdc91ebcbbe3cd31bb0403787d9eb21a7f08c3c0424f WatchSource:0}: Error finding container 453282231f47d96bfccebdc91ebcbbe3cd31bb0403787d9eb21a7f08c3c0424f: Status 404 returned error can't find the container with id 453282231f47d96bfccebdc91ebcbbe3cd31bb0403787d9eb21a7f08c3c0424f Mar 10 00:33:27 crc kubenswrapper[4906]: I0310 00:33:27.411695 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-snmp-webhook-6856cfb745-zsbkd" event={"ID":"b7a5d692-659c-4912-b290-4a4010f370b6","Type":"ContainerStarted","Data":"307b13e4bd95a92d82f58c87b4b910e791d6eac6a2d5ff17bf5e09a8e5ecd062"} Mar 10 00:33:27 crc kubenswrapper[4906]: I0310 00:33:27.412729 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"f531aabb-4f0a-49a6-95ed-f5e3c04ca148","Type":"ContainerStarted","Data":"453282231f47d96bfccebdc91ebcbbe3cd31bb0403787d9eb21a7f08c3c0424f"} Mar 
10 00:33:27 crc kubenswrapper[4906]: I0310 00:33:27.434836 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-snmp-webhook-6856cfb745-zsbkd" podStartSLOduration=2.225318996 podStartE2EDuration="10.4348136s" podCreationTimestamp="2026-03-10 00:33:17 +0000 UTC" firstStartedPulling="2026-03-10 00:33:18.484763225 +0000 UTC m=+1624.632658357" lastFinishedPulling="2026-03-10 00:33:26.694257849 +0000 UTC m=+1632.842152961" observedRunningTime="2026-03-10 00:33:27.428187344 +0000 UTC m=+1633.576082446" watchObservedRunningTime="2026-03-10 00:33:27.4348136 +0000 UTC m=+1633.582708722" Mar 10 00:33:29 crc kubenswrapper[4906]: I0310 00:33:29.428352 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"f531aabb-4f0a-49a6-95ed-f5e3c04ca148","Type":"ContainerStarted","Data":"96af70948074fcac7fb6aaf9c8dd9d1bc8f74835be3f72bbec2bc5acb54ab4da"} Mar 10 00:33:32 crc kubenswrapper[4906]: I0310 00:33:32.453908 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"7a0470ff-1325-4c14-8b7d-93f6ecad34e6","Type":"ContainerStarted","Data":"9b2c3b135d97ac4e6202967b155eb80247543fd08129ad63e9aaea5d5acf8c93"} Mar 10 00:33:34 crc kubenswrapper[4906]: I0310 00:33:34.467937 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"7a0470ff-1325-4c14-8b7d-93f6ecad34e6","Type":"ContainerStarted","Data":"65a25d5c43614191f35bf7905366edef55dcf078509b6135ff6281d80bc02c87"} Mar 10 00:33:34 crc kubenswrapper[4906]: I0310 00:33:34.581902 4906 scope.go:117] "RemoveContainer" containerID="3d03901045dd3898d4d4472388a38a1a6380b91b22032509551213e74b4671a3" Mar 10 00:33:34 crc kubenswrapper[4906]: E0310 00:33:34.582125 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-bxtw4_openshift-machine-config-operator(72d61d35-0a64-45a5-8df3-9c429727deba)\"" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" Mar 10 00:33:34 crc kubenswrapper[4906]: I0310 00:33:34.612078 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-nvklp"] Mar 10 00:33:34 crc kubenswrapper[4906]: I0310 00:33:34.614002 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-nvklp" Mar 10 00:33:34 crc kubenswrapper[4906]: I0310 00:33:34.616590 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-session-secret" Mar 10 00:33:34 crc kubenswrapper[4906]: I0310 00:33:34.616834 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-coll-meter-sg-core-configmap" Mar 10 00:33:34 crc kubenswrapper[4906]: I0310 00:33:34.616970 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-coll-meter-proxy-tls" Mar 10 00:33:34 crc kubenswrapper[4906]: I0310 00:33:34.618499 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-dockercfg-kwph7" Mar 10 00:33:34 crc kubenswrapper[4906]: I0310 00:33:34.627546 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-nvklp"] Mar 10 00:33:34 crc kubenswrapper[4906]: I0310 00:33:34.685031 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/17b2c3a4-f715-4e49-baf6-ec674881557e-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-nvklp\" (UID: 
\"17b2c3a4-f715-4e49-baf6-ec674881557e\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-nvklp" Mar 10 00:33:34 crc kubenswrapper[4906]: I0310 00:33:34.685121 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/17b2c3a4-f715-4e49-baf6-ec674881557e-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-nvklp\" (UID: \"17b2c3a4-f715-4e49-baf6-ec674881557e\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-nvklp" Mar 10 00:33:34 crc kubenswrapper[4906]: I0310 00:33:34.685143 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8h2t\" (UniqueName: \"kubernetes.io/projected/17b2c3a4-f715-4e49-baf6-ec674881557e-kube-api-access-m8h2t\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-nvklp\" (UID: \"17b2c3a4-f715-4e49-baf6-ec674881557e\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-nvklp" Mar 10 00:33:34 crc kubenswrapper[4906]: I0310 00:33:34.685184 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/17b2c3a4-f715-4e49-baf6-ec674881557e-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-nvklp\" (UID: \"17b2c3a4-f715-4e49-baf6-ec674881557e\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-nvklp" Mar 10 00:33:34 crc kubenswrapper[4906]: I0310 00:33:34.685200 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/17b2c3a4-f715-4e49-baf6-ec674881557e-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-nvklp\" (UID: \"17b2c3a4-f715-4e49-baf6-ec674881557e\") " 
pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-nvklp" Mar 10 00:33:34 crc kubenswrapper[4906]: I0310 00:33:34.786627 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/17b2c3a4-f715-4e49-baf6-ec674881557e-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-nvklp\" (UID: \"17b2c3a4-f715-4e49-baf6-ec674881557e\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-nvklp" Mar 10 00:33:34 crc kubenswrapper[4906]: I0310 00:33:34.787075 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8h2t\" (UniqueName: \"kubernetes.io/projected/17b2c3a4-f715-4e49-baf6-ec674881557e-kube-api-access-m8h2t\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-nvklp\" (UID: \"17b2c3a4-f715-4e49-baf6-ec674881557e\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-nvklp" Mar 10 00:33:34 crc kubenswrapper[4906]: I0310 00:33:34.787093 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/17b2c3a4-f715-4e49-baf6-ec674881557e-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-nvklp\" (UID: \"17b2c3a4-f715-4e49-baf6-ec674881557e\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-nvklp" Mar 10 00:33:34 crc kubenswrapper[4906]: I0310 00:33:34.787134 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/17b2c3a4-f715-4e49-baf6-ec674881557e-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-nvklp\" (UID: \"17b2c3a4-f715-4e49-baf6-ec674881557e\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-nvklp" Mar 10 00:33:34 crc kubenswrapper[4906]: I0310 00:33:34.787162 4906 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/17b2c3a4-f715-4e49-baf6-ec674881557e-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-nvklp\" (UID: \"17b2c3a4-f715-4e49-baf6-ec674881557e\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-nvklp" Mar 10 00:33:34 crc kubenswrapper[4906]: I0310 00:33:34.787264 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/17b2c3a4-f715-4e49-baf6-ec674881557e-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-nvklp\" (UID: \"17b2c3a4-f715-4e49-baf6-ec674881557e\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-nvklp" Mar 10 00:33:34 crc kubenswrapper[4906]: E0310 00:33:34.787415 4906 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-coll-meter-proxy-tls: secret "default-cloud1-coll-meter-proxy-tls" not found Mar 10 00:33:34 crc kubenswrapper[4906]: E0310 00:33:34.787516 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17b2c3a4-f715-4e49-baf6-ec674881557e-default-cloud1-coll-meter-proxy-tls podName:17b2c3a4-f715-4e49-baf6-ec674881557e nodeName:}" failed. No retries permitted until 2026-03-10 00:33:35.287495088 +0000 UTC m=+1641.435390200 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "default-cloud1-coll-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/17b2c3a4-f715-4e49-baf6-ec674881557e-default-cloud1-coll-meter-proxy-tls") pod "default-cloud1-coll-meter-smartgateway-7cd87f9766-nvklp" (UID: "17b2c3a4-f715-4e49-baf6-ec674881557e") : secret "default-cloud1-coll-meter-proxy-tls" not found Mar 10 00:33:34 crc kubenswrapper[4906]: I0310 00:33:34.788307 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/17b2c3a4-f715-4e49-baf6-ec674881557e-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-nvklp\" (UID: \"17b2c3a4-f715-4e49-baf6-ec674881557e\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-nvklp" Mar 10 00:33:34 crc kubenswrapper[4906]: I0310 00:33:34.798202 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/17b2c3a4-f715-4e49-baf6-ec674881557e-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-nvklp\" (UID: \"17b2c3a4-f715-4e49-baf6-ec674881557e\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-nvklp" Mar 10 00:33:34 crc kubenswrapper[4906]: I0310 00:33:34.808332 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8h2t\" (UniqueName: \"kubernetes.io/projected/17b2c3a4-f715-4e49-baf6-ec674881557e-kube-api-access-m8h2t\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-nvklp\" (UID: \"17b2c3a4-f715-4e49-baf6-ec674881557e\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-nvklp" Mar 10 00:33:35 crc kubenswrapper[4906]: I0310 00:33:35.293391 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/17b2c3a4-f715-4e49-baf6-ec674881557e-default-cloud1-coll-meter-proxy-tls\") pod 
\"default-cloud1-coll-meter-smartgateway-7cd87f9766-nvklp\" (UID: \"17b2c3a4-f715-4e49-baf6-ec674881557e\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-nvklp" Mar 10 00:33:35 crc kubenswrapper[4906]: E0310 00:33:35.293619 4906 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-coll-meter-proxy-tls: secret "default-cloud1-coll-meter-proxy-tls" not found Mar 10 00:33:35 crc kubenswrapper[4906]: E0310 00:33:35.293719 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17b2c3a4-f715-4e49-baf6-ec674881557e-default-cloud1-coll-meter-proxy-tls podName:17b2c3a4-f715-4e49-baf6-ec674881557e nodeName:}" failed. No retries permitted until 2026-03-10 00:33:36.293700957 +0000 UTC m=+1642.441596079 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "default-cloud1-coll-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/17b2c3a4-f715-4e49-baf6-ec674881557e-default-cloud1-coll-meter-proxy-tls") pod "default-cloud1-coll-meter-smartgateway-7cd87f9766-nvklp" (UID: "17b2c3a4-f715-4e49-baf6-ec674881557e") : secret "default-cloud1-coll-meter-proxy-tls" not found Mar 10 00:33:35 crc kubenswrapper[4906]: I0310 00:33:35.489235 4906 generic.go:334] "Generic (PLEG): container finished" podID="f531aabb-4f0a-49a6-95ed-f5e3c04ca148" containerID="96af70948074fcac7fb6aaf9c8dd9d1bc8f74835be3f72bbec2bc5acb54ab4da" exitCode=0 Mar 10 00:33:35 crc kubenswrapper[4906]: I0310 00:33:35.489290 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"f531aabb-4f0a-49a6-95ed-f5e3c04ca148","Type":"ContainerDied","Data":"96af70948074fcac7fb6aaf9c8dd9d1bc8f74835be3f72bbec2bc5acb54ab4da"} Mar 10 00:33:36 crc kubenswrapper[4906]: I0310 00:33:36.309388 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/17b2c3a4-f715-4e49-baf6-ec674881557e-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-nvklp\" (UID: \"17b2c3a4-f715-4e49-baf6-ec674881557e\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-nvklp" Mar 10 00:33:36 crc kubenswrapper[4906]: I0310 00:33:36.313494 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/17b2c3a4-f715-4e49-baf6-ec674881557e-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-nvklp\" (UID: \"17b2c3a4-f715-4e49-baf6-ec674881557e\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-nvklp" Mar 10 00:33:36 crc kubenswrapper[4906]: I0310 00:33:36.434006 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-nvklp" Mar 10 00:33:36 crc kubenswrapper[4906]: I0310 00:33:36.906370 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-nvklp"] Mar 10 00:33:37 crc kubenswrapper[4906]: I0310 00:33:37.371904 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-8z7rm"] Mar 10 00:33:37 crc kubenswrapper[4906]: I0310 00:33:37.373857 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-8z7rm" Mar 10 00:33:37 crc kubenswrapper[4906]: I0310 00:33:37.375732 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-ceil-meter-sg-core-configmap" Mar 10 00:33:37 crc kubenswrapper[4906]: I0310 00:33:37.376542 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-ceil-meter-proxy-tls" Mar 10 00:33:37 crc kubenswrapper[4906]: I0310 00:33:37.383381 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-8z7rm"] Mar 10 00:33:37 crc kubenswrapper[4906]: I0310 00:33:37.529159 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/6626cf94-749f-4f6e-8a4e-9c58410aaa46-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-8z7rm\" (UID: \"6626cf94-749f-4f6e-8a4e-9c58410aaa46\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-8z7rm" Mar 10 00:33:37 crc kubenswrapper[4906]: I0310 00:33:37.529232 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/6626cf94-749f-4f6e-8a4e-9c58410aaa46-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-8z7rm\" (UID: \"6626cf94-749f-4f6e-8a4e-9c58410aaa46\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-8z7rm" Mar 10 00:33:37 crc kubenswrapper[4906]: I0310 00:33:37.529322 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/6626cf94-749f-4f6e-8a4e-9c58410aaa46-socket-dir\") pod 
\"default-cloud1-ceil-meter-smartgateway-57948895dc-8z7rm\" (UID: \"6626cf94-749f-4f6e-8a4e-9c58410aaa46\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-8z7rm" Mar 10 00:33:37 crc kubenswrapper[4906]: I0310 00:33:37.529345 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drqfz\" (UniqueName: \"kubernetes.io/projected/6626cf94-749f-4f6e-8a4e-9c58410aaa46-kube-api-access-drqfz\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-8z7rm\" (UID: \"6626cf94-749f-4f6e-8a4e-9c58410aaa46\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-8z7rm" Mar 10 00:33:37 crc kubenswrapper[4906]: I0310 00:33:37.529387 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/6626cf94-749f-4f6e-8a4e-9c58410aaa46-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-8z7rm\" (UID: \"6626cf94-749f-4f6e-8a4e-9c58410aaa46\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-8z7rm" Mar 10 00:33:37 crc kubenswrapper[4906]: I0310 00:33:37.630752 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/6626cf94-749f-4f6e-8a4e-9c58410aaa46-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-8z7rm\" (UID: \"6626cf94-749f-4f6e-8a4e-9c58410aaa46\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-8z7rm" Mar 10 00:33:37 crc kubenswrapper[4906]: I0310 00:33:37.630795 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drqfz\" (UniqueName: \"kubernetes.io/projected/6626cf94-749f-4f6e-8a4e-9c58410aaa46-kube-api-access-drqfz\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-8z7rm\" (UID: \"6626cf94-749f-4f6e-8a4e-9c58410aaa46\") " 
pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-8z7rm" Mar 10 00:33:37 crc kubenswrapper[4906]: I0310 00:33:37.630838 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/6626cf94-749f-4f6e-8a4e-9c58410aaa46-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-8z7rm\" (UID: \"6626cf94-749f-4f6e-8a4e-9c58410aaa46\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-8z7rm" Mar 10 00:33:37 crc kubenswrapper[4906]: I0310 00:33:37.630870 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/6626cf94-749f-4f6e-8a4e-9c58410aaa46-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-8z7rm\" (UID: \"6626cf94-749f-4f6e-8a4e-9c58410aaa46\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-8z7rm" Mar 10 00:33:37 crc kubenswrapper[4906]: I0310 00:33:37.630892 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/6626cf94-749f-4f6e-8a4e-9c58410aaa46-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-8z7rm\" (UID: \"6626cf94-749f-4f6e-8a4e-9c58410aaa46\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-8z7rm" Mar 10 00:33:37 crc kubenswrapper[4906]: E0310 00:33:37.631289 4906 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-ceil-meter-proxy-tls: secret "default-cloud1-ceil-meter-proxy-tls" not found Mar 10 00:33:37 crc kubenswrapper[4906]: E0310 00:33:37.631373 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6626cf94-749f-4f6e-8a4e-9c58410aaa46-default-cloud1-ceil-meter-proxy-tls podName:6626cf94-749f-4f6e-8a4e-9c58410aaa46 nodeName:}" failed. 
No retries permitted until 2026-03-10 00:33:38.131353618 +0000 UTC m=+1644.279248730 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-cloud1-ceil-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/6626cf94-749f-4f6e-8a4e-9c58410aaa46-default-cloud1-ceil-meter-proxy-tls") pod "default-cloud1-ceil-meter-smartgateway-57948895dc-8z7rm" (UID: "6626cf94-749f-4f6e-8a4e-9c58410aaa46") : secret "default-cloud1-ceil-meter-proxy-tls" not found Mar 10 00:33:37 crc kubenswrapper[4906]: I0310 00:33:37.631804 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/6626cf94-749f-4f6e-8a4e-9c58410aaa46-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-8z7rm\" (UID: \"6626cf94-749f-4f6e-8a4e-9c58410aaa46\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-8z7rm" Mar 10 00:33:37 crc kubenswrapper[4906]: I0310 00:33:37.631955 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/6626cf94-749f-4f6e-8a4e-9c58410aaa46-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-8z7rm\" (UID: \"6626cf94-749f-4f6e-8a4e-9c58410aaa46\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-8z7rm" Mar 10 00:33:37 crc kubenswrapper[4906]: I0310 00:33:37.655211 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/6626cf94-749f-4f6e-8a4e-9c58410aaa46-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-8z7rm\" (UID: \"6626cf94-749f-4f6e-8a4e-9c58410aaa46\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-8z7rm" Mar 10 00:33:37 crc kubenswrapper[4906]: I0310 00:33:37.658312 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drqfz\" (UniqueName: 
\"kubernetes.io/projected/6626cf94-749f-4f6e-8a4e-9c58410aaa46-kube-api-access-drqfz\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-8z7rm\" (UID: \"6626cf94-749f-4f6e-8a4e-9c58410aaa46\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-8z7rm" Mar 10 00:33:38 crc kubenswrapper[4906]: I0310 00:33:38.138024 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/6626cf94-749f-4f6e-8a4e-9c58410aaa46-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-8z7rm\" (UID: \"6626cf94-749f-4f6e-8a4e-9c58410aaa46\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-8z7rm" Mar 10 00:33:38 crc kubenswrapper[4906]: E0310 00:33:38.138147 4906 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-ceil-meter-proxy-tls: secret "default-cloud1-ceil-meter-proxy-tls" not found Mar 10 00:33:38 crc kubenswrapper[4906]: E0310 00:33:38.138463 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6626cf94-749f-4f6e-8a4e-9c58410aaa46-default-cloud1-ceil-meter-proxy-tls podName:6626cf94-749f-4f6e-8a4e-9c58410aaa46 nodeName:}" failed. No retries permitted until 2026-03-10 00:33:39.138446302 +0000 UTC m=+1645.286341414 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "default-cloud1-ceil-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/6626cf94-749f-4f6e-8a4e-9c58410aaa46-default-cloud1-ceil-meter-proxy-tls") pod "default-cloud1-ceil-meter-smartgateway-57948895dc-8z7rm" (UID: "6626cf94-749f-4f6e-8a4e-9c58410aaa46") : secret "default-cloud1-ceil-meter-proxy-tls" not found Mar 10 00:33:39 crc kubenswrapper[4906]: I0310 00:33:39.151411 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/6626cf94-749f-4f6e-8a4e-9c58410aaa46-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-8z7rm\" (UID: \"6626cf94-749f-4f6e-8a4e-9c58410aaa46\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-8z7rm" Mar 10 00:33:39 crc kubenswrapper[4906]: I0310 00:33:39.167168 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/6626cf94-749f-4f6e-8a4e-9c58410aaa46-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-8z7rm\" (UID: \"6626cf94-749f-4f6e-8a4e-9c58410aaa46\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-8z7rm" Mar 10 00:33:39 crc kubenswrapper[4906]: I0310 00:33:39.193629 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-8z7rm" Mar 10 00:33:40 crc kubenswrapper[4906]: I0310 00:33:40.328822 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8m4z"] Mar 10 00:33:40 crc kubenswrapper[4906]: I0310 00:33:40.330127 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8m4z" Mar 10 00:33:40 crc kubenswrapper[4906]: I0310 00:33:40.332664 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-sens-meter-proxy-tls" Mar 10 00:33:40 crc kubenswrapper[4906]: I0310 00:33:40.332682 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-sens-meter-sg-core-configmap" Mar 10 00:33:40 crc kubenswrapper[4906]: I0310 00:33:40.342357 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8m4z"] Mar 10 00:33:40 crc kubenswrapper[4906]: I0310 00:33:40.484692 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/216238e4-5dbc-4cc6-838b-d8521159606f-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-n8m4z\" (UID: \"216238e4-5dbc-4cc6-838b-d8521159606f\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8m4z" Mar 10 00:33:40 crc kubenswrapper[4906]: I0310 00:33:40.484809 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/216238e4-5dbc-4cc6-838b-d8521159606f-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-n8m4z\" (UID: \"216238e4-5dbc-4cc6-838b-d8521159606f\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8m4z" Mar 10 00:33:40 crc kubenswrapper[4906]: I0310 00:33:40.484851 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/216238e4-5dbc-4cc6-838b-d8521159606f-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-n8m4z\" 
(UID: \"216238e4-5dbc-4cc6-838b-d8521159606f\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8m4z" Mar 10 00:33:40 crc kubenswrapper[4906]: I0310 00:33:40.484894 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/216238e4-5dbc-4cc6-838b-d8521159606f-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-n8m4z\" (UID: \"216238e4-5dbc-4cc6-838b-d8521159606f\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8m4z" Mar 10 00:33:40 crc kubenswrapper[4906]: I0310 00:33:40.485003 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g94v\" (UniqueName: \"kubernetes.io/projected/216238e4-5dbc-4cc6-838b-d8521159606f-kube-api-access-6g94v\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-n8m4z\" (UID: \"216238e4-5dbc-4cc6-838b-d8521159606f\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8m4z" Mar 10 00:33:40 crc kubenswrapper[4906]: I0310 00:33:40.526202 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-nvklp" event={"ID":"17b2c3a4-f715-4e49-baf6-ec674881557e","Type":"ContainerStarted","Data":"8c2f2dbb025dd1c8b529c5300515aec5584355ba301667c014b7111d261471e3"} Mar 10 00:33:40 crc kubenswrapper[4906]: I0310 00:33:40.586698 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/216238e4-5dbc-4cc6-838b-d8521159606f-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-n8m4z\" (UID: \"216238e4-5dbc-4cc6-838b-d8521159606f\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8m4z" Mar 10 00:33:40 crc kubenswrapper[4906]: I0310 00:33:40.586757 4906 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/216238e4-5dbc-4cc6-838b-d8521159606f-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-n8m4z\" (UID: \"216238e4-5dbc-4cc6-838b-d8521159606f\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8m4z" Mar 10 00:33:40 crc kubenswrapper[4906]: I0310 00:33:40.586783 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/216238e4-5dbc-4cc6-838b-d8521159606f-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-n8m4z\" (UID: \"216238e4-5dbc-4cc6-838b-d8521159606f\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8m4z" Mar 10 00:33:40 crc kubenswrapper[4906]: I0310 00:33:40.586805 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6g94v\" (UniqueName: \"kubernetes.io/projected/216238e4-5dbc-4cc6-838b-d8521159606f-kube-api-access-6g94v\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-n8m4z\" (UID: \"216238e4-5dbc-4cc6-838b-d8521159606f\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8m4z" Mar 10 00:33:40 crc kubenswrapper[4906]: I0310 00:33:40.586841 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/216238e4-5dbc-4cc6-838b-d8521159606f-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-n8m4z\" (UID: \"216238e4-5dbc-4cc6-838b-d8521159606f\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8m4z" Mar 10 00:33:40 crc kubenswrapper[4906]: E0310 00:33:40.587014 4906 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-sens-meter-proxy-tls: secret "default-cloud1-sens-meter-proxy-tls" not found Mar 10 00:33:40 crc kubenswrapper[4906]: E0310 00:33:40.587066 4906 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/216238e4-5dbc-4cc6-838b-d8521159606f-default-cloud1-sens-meter-proxy-tls podName:216238e4-5dbc-4cc6-838b-d8521159606f nodeName:}" failed. No retries permitted until 2026-03-10 00:33:41.087048411 +0000 UTC m=+1647.234943513 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-cloud1-sens-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/216238e4-5dbc-4cc6-838b-d8521159606f-default-cloud1-sens-meter-proxy-tls") pod "default-cloud1-sens-meter-smartgateway-5759b4d97-n8m4z" (UID: "216238e4-5dbc-4cc6-838b-d8521159606f") : secret "default-cloud1-sens-meter-proxy-tls" not found Mar 10 00:33:40 crc kubenswrapper[4906]: I0310 00:33:40.587371 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/216238e4-5dbc-4cc6-838b-d8521159606f-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-n8m4z\" (UID: \"216238e4-5dbc-4cc6-838b-d8521159606f\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8m4z" Mar 10 00:33:40 crc kubenswrapper[4906]: I0310 00:33:40.587631 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/216238e4-5dbc-4cc6-838b-d8521159606f-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-n8m4z\" (UID: \"216238e4-5dbc-4cc6-838b-d8521159606f\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8m4z" Mar 10 00:33:40 crc kubenswrapper[4906]: I0310 00:33:40.594439 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/216238e4-5dbc-4cc6-838b-d8521159606f-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-n8m4z\" (UID: \"216238e4-5dbc-4cc6-838b-d8521159606f\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8m4z" 
Mar 10 00:33:40 crc kubenswrapper[4906]: I0310 00:33:40.616192 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6g94v\" (UniqueName: \"kubernetes.io/projected/216238e4-5dbc-4cc6-838b-d8521159606f-kube-api-access-6g94v\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-n8m4z\" (UID: \"216238e4-5dbc-4cc6-838b-d8521159606f\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8m4z" Mar 10 00:33:41 crc kubenswrapper[4906]: I0310 00:33:41.094394 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/216238e4-5dbc-4cc6-838b-d8521159606f-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-n8m4z\" (UID: \"216238e4-5dbc-4cc6-838b-d8521159606f\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8m4z" Mar 10 00:33:41 crc kubenswrapper[4906]: E0310 00:33:41.094604 4906 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-sens-meter-proxy-tls: secret "default-cloud1-sens-meter-proxy-tls" not found Mar 10 00:33:41 crc kubenswrapper[4906]: E0310 00:33:41.094693 4906 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/216238e4-5dbc-4cc6-838b-d8521159606f-default-cloud1-sens-meter-proxy-tls podName:216238e4-5dbc-4cc6-838b-d8521159606f nodeName:}" failed. No retries permitted until 2026-03-10 00:33:42.09467335 +0000 UTC m=+1648.242568462 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "default-cloud1-sens-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/216238e4-5dbc-4cc6-838b-d8521159606f-default-cloud1-sens-meter-proxy-tls") pod "default-cloud1-sens-meter-smartgateway-5759b4d97-n8m4z" (UID: "216238e4-5dbc-4cc6-838b-d8521159606f") : secret "default-cloud1-sens-meter-proxy-tls" not found Mar 10 00:33:42 crc kubenswrapper[4906]: I0310 00:33:42.109619 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/216238e4-5dbc-4cc6-838b-d8521159606f-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-n8m4z\" (UID: \"216238e4-5dbc-4cc6-838b-d8521159606f\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8m4z" Mar 10 00:33:42 crc kubenswrapper[4906]: I0310 00:33:42.121366 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/216238e4-5dbc-4cc6-838b-d8521159606f-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-n8m4z\" (UID: \"216238e4-5dbc-4cc6-838b-d8521159606f\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8m4z" Mar 10 00:33:42 crc kubenswrapper[4906]: I0310 00:33:42.152696 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8m4z" Mar 10 00:33:42 crc kubenswrapper[4906]: I0310 00:33:42.738114 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-8z7rm"] Mar 10 00:33:43 crc kubenswrapper[4906]: I0310 00:33:43.551511 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-8z7rm" event={"ID":"6626cf94-749f-4f6e-8a4e-9c58410aaa46","Type":"ContainerStarted","Data":"1aba75c519f31bb30a971b3bd249d527496703b4cbac2ad2c3c8bf4381821166"} Mar 10 00:33:44 crc kubenswrapper[4906]: I0310 00:33:44.113223 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8m4z"] Mar 10 00:33:44 crc kubenswrapper[4906]: W0310 00:33:44.173184 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod216238e4_5dbc_4cc6_838b_d8521159606f.slice/crio-1a8d1734beca9a5d818457f99fcd780ce154592cefd5b964260f7591556157df WatchSource:0}: Error finding container 1a8d1734beca9a5d818457f99fcd780ce154592cefd5b964260f7591556157df: Status 404 returned error can't find the container with id 1a8d1734beca9a5d818457f99fcd780ce154592cefd5b964260f7591556157df Mar 10 00:33:44 crc kubenswrapper[4906]: I0310 00:33:44.559452 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-8z7rm" event={"ID":"6626cf94-749f-4f6e-8a4e-9c58410aaa46","Type":"ContainerStarted","Data":"1605f3caf57c83b723af4acc5c4c0b3e44c348329a69ad3c9a61f1ee4abf8af5"} Mar 10 00:33:44 crc kubenswrapper[4906]: I0310 00:33:44.566542 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" 
event={"ID":"f531aabb-4f0a-49a6-95ed-f5e3c04ca148","Type":"ContainerStarted","Data":"1899b7c8c465b5a52865591000554e8877b0fc4e8347d720b4a0fcf00e9824b3"} Mar 10 00:33:44 crc kubenswrapper[4906]: I0310 00:33:44.569915 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"7a0470ff-1325-4c14-8b7d-93f6ecad34e6","Type":"ContainerStarted","Data":"af3c7b91d65b537933d63eee0083e2e23d7105d2779c521791b90c9a9e33df1d"} Mar 10 00:33:44 crc kubenswrapper[4906]: I0310 00:33:44.571200 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-nvklp" event={"ID":"17b2c3a4-f715-4e49-baf6-ec674881557e","Type":"ContainerStarted","Data":"de9de414a4b2306f69be165c274f9128e71708478f02cabf21a27db3d7395f54"} Mar 10 00:33:44 crc kubenswrapper[4906]: I0310 00:33:44.572217 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8m4z" event={"ID":"216238e4-5dbc-4cc6-838b-d8521159606f","Type":"ContainerStarted","Data":"1a8d1734beca9a5d818457f99fcd780ce154592cefd5b964260f7591556157df"} Mar 10 00:33:44 crc kubenswrapper[4906]: I0310 00:33:44.616378 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-default-0" podStartSLOduration=4.375626137 podStartE2EDuration="37.616358s" podCreationTimestamp="2026-03-10 00:33:07 +0000 UTC" firstStartedPulling="2026-03-10 00:33:10.440038877 +0000 UTC m=+1616.587933999" lastFinishedPulling="2026-03-10 00:33:43.68077075 +0000 UTC m=+1649.828665862" observedRunningTime="2026-03-10 00:33:44.612210123 +0000 UTC m=+1650.760105225" watchObservedRunningTime="2026-03-10 00:33:44.616358 +0000 UTC m=+1650.764253112" Mar 10 00:33:44 crc kubenswrapper[4906]: I0310 00:33:44.965305 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/prometheus-default-0" Mar 10 00:33:45 crc 
kubenswrapper[4906]: I0310 00:33:45.580143 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-8z7rm" event={"ID":"6626cf94-749f-4f6e-8a4e-9c58410aaa46","Type":"ContainerStarted","Data":"ed71d44ce50559a6e814c1b0e2c04008591b34cd1e35e0a5e65636549414665f"} Mar 10 00:33:46 crc kubenswrapper[4906]: I0310 00:33:46.591825 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-nvklp" event={"ID":"17b2c3a4-f715-4e49-baf6-ec674881557e","Type":"ContainerStarted","Data":"e20f43e0fa1ccc000a74f94283064218801c3f6c7b961d924fc066dab6b48a6f"} Mar 10 00:33:46 crc kubenswrapper[4906]: I0310 00:33:46.593569 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8m4z" event={"ID":"216238e4-5dbc-4cc6-838b-d8521159606f","Type":"ContainerStarted","Data":"c29ba1789c969b5ff8935b3c09c06cfb1743c9899c3f5aae689d747be03bf6e6"} Mar 10 00:33:46 crc kubenswrapper[4906]: I0310 00:33:46.595952 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"f531aabb-4f0a-49a6-95ed-f5e3c04ca148","Type":"ContainerStarted","Data":"0d2d4241cdfa96870f88c5cf1c0f40e7c5a7d2106862084a9e16fa3eb82ddb23"} Mar 10 00:33:47 crc kubenswrapper[4906]: I0310 00:33:47.913466 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-6d5f89bf86-x79fr"] Mar 10 00:33:47 crc kubenswrapper[4906]: I0310 00:33:47.915050 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-coll-event-smartgateway-6d5f89bf86-x79fr" Mar 10 00:33:47 crc kubenswrapper[4906]: I0310 00:33:47.917794 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-coll-event-sg-core-configmap" Mar 10 00:33:47 crc kubenswrapper[4906]: I0310 00:33:47.917908 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-cert" Mar 10 00:33:47 crc kubenswrapper[4906]: I0310 00:33:47.931887 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-6d5f89bf86-x79fr"] Mar 10 00:33:48 crc kubenswrapper[4906]: I0310 00:33:48.034819 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/c914b63b-284c-4693-b383-0583ea97faaf-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-6d5f89bf86-x79fr\" (UID: \"c914b63b-284c-4693-b383-0583ea97faaf\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-6d5f89bf86-x79fr" Mar 10 00:33:48 crc kubenswrapper[4906]: I0310 00:33:48.034904 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/c914b63b-284c-4693-b383-0583ea97faaf-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-6d5f89bf86-x79fr\" (UID: \"c914b63b-284c-4693-b383-0583ea97faaf\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-6d5f89bf86-x79fr" Mar 10 00:33:48 crc kubenswrapper[4906]: I0310 00:33:48.034985 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c2wb\" (UniqueName: \"kubernetes.io/projected/c914b63b-284c-4693-b383-0583ea97faaf-kube-api-access-4c2wb\") pod \"default-cloud1-coll-event-smartgateway-6d5f89bf86-x79fr\" (UID: 
\"c914b63b-284c-4693-b383-0583ea97faaf\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-6d5f89bf86-x79fr" Mar 10 00:33:48 crc kubenswrapper[4906]: I0310 00:33:48.035101 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/c914b63b-284c-4693-b383-0583ea97faaf-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-6d5f89bf86-x79fr\" (UID: \"c914b63b-284c-4693-b383-0583ea97faaf\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-6d5f89bf86-x79fr" Mar 10 00:33:48 crc kubenswrapper[4906]: I0310 00:33:48.136700 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4c2wb\" (UniqueName: \"kubernetes.io/projected/c914b63b-284c-4693-b383-0583ea97faaf-kube-api-access-4c2wb\") pod \"default-cloud1-coll-event-smartgateway-6d5f89bf86-x79fr\" (UID: \"c914b63b-284c-4693-b383-0583ea97faaf\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-6d5f89bf86-x79fr" Mar 10 00:33:48 crc kubenswrapper[4906]: I0310 00:33:48.136763 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/c914b63b-284c-4693-b383-0583ea97faaf-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-6d5f89bf86-x79fr\" (UID: \"c914b63b-284c-4693-b383-0583ea97faaf\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-6d5f89bf86-x79fr" Mar 10 00:33:48 crc kubenswrapper[4906]: I0310 00:33:48.136823 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/c914b63b-284c-4693-b383-0583ea97faaf-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-6d5f89bf86-x79fr\" (UID: \"c914b63b-284c-4693-b383-0583ea97faaf\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-6d5f89bf86-x79fr" Mar 10 00:33:48 crc kubenswrapper[4906]: I0310 
00:33:48.136857 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/c914b63b-284c-4693-b383-0583ea97faaf-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-6d5f89bf86-x79fr\" (UID: \"c914b63b-284c-4693-b383-0583ea97faaf\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-6d5f89bf86-x79fr" Mar 10 00:33:48 crc kubenswrapper[4906]: I0310 00:33:48.137363 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/c914b63b-284c-4693-b383-0583ea97faaf-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-6d5f89bf86-x79fr\" (UID: \"c914b63b-284c-4693-b383-0583ea97faaf\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-6d5f89bf86-x79fr" Mar 10 00:33:48 crc kubenswrapper[4906]: I0310 00:33:48.138874 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/c914b63b-284c-4693-b383-0583ea97faaf-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-6d5f89bf86-x79fr\" (UID: \"c914b63b-284c-4693-b383-0583ea97faaf\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-6d5f89bf86-x79fr" Mar 10 00:33:48 crc kubenswrapper[4906]: I0310 00:33:48.157799 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/c914b63b-284c-4693-b383-0583ea97faaf-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-6d5f89bf86-x79fr\" (UID: \"c914b63b-284c-4693-b383-0583ea97faaf\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-6d5f89bf86-x79fr" Mar 10 00:33:48 crc kubenswrapper[4906]: I0310 00:33:48.203479 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4c2wb\" (UniqueName: \"kubernetes.io/projected/c914b63b-284c-4693-b383-0583ea97faaf-kube-api-access-4c2wb\") pod 
\"default-cloud1-coll-event-smartgateway-6d5f89bf86-x79fr\" (UID: \"c914b63b-284c-4693-b383-0583ea97faaf\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-6d5f89bf86-x79fr" Mar 10 00:33:48 crc kubenswrapper[4906]: I0310 00:33:48.256914 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-event-smartgateway-6d5f89bf86-x79fr" Mar 10 00:33:49 crc kubenswrapper[4906]: I0310 00:33:49.191341 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-c54b5cd7b-2mf46"] Mar 10 00:33:49 crc kubenswrapper[4906]: I0310 00:33:49.194107 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-c54b5cd7b-2mf46" Mar 10 00:33:49 crc kubenswrapper[4906]: I0310 00:33:49.197484 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-ceil-event-sg-core-configmap" Mar 10 00:33:49 crc kubenswrapper[4906]: I0310 00:33:49.202287 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-c54b5cd7b-2mf46"] Mar 10 00:33:49 crc kubenswrapper[4906]: I0310 00:33:49.352473 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/436fc62f-0a9b-46f8-8b63-a2ad9a34dfff-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-c54b5cd7b-2mf46\" (UID: \"436fc62f-0a9b-46f8-8b63-a2ad9a34dfff\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-c54b5cd7b-2mf46" Mar 10 00:33:49 crc kubenswrapper[4906]: I0310 00:33:49.352521 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m42wn\" (UniqueName: \"kubernetes.io/projected/436fc62f-0a9b-46f8-8b63-a2ad9a34dfff-kube-api-access-m42wn\") pod 
\"default-cloud1-ceil-event-smartgateway-c54b5cd7b-2mf46\" (UID: \"436fc62f-0a9b-46f8-8b63-a2ad9a34dfff\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-c54b5cd7b-2mf46" Mar 10 00:33:49 crc kubenswrapper[4906]: I0310 00:33:49.352553 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/436fc62f-0a9b-46f8-8b63-a2ad9a34dfff-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-c54b5cd7b-2mf46\" (UID: \"436fc62f-0a9b-46f8-8b63-a2ad9a34dfff\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-c54b5cd7b-2mf46" Mar 10 00:33:49 crc kubenswrapper[4906]: I0310 00:33:49.352572 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/436fc62f-0a9b-46f8-8b63-a2ad9a34dfff-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-c54b5cd7b-2mf46\" (UID: \"436fc62f-0a9b-46f8-8b63-a2ad9a34dfff\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-c54b5cd7b-2mf46" Mar 10 00:33:49 crc kubenswrapper[4906]: I0310 00:33:49.453849 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/436fc62f-0a9b-46f8-8b63-a2ad9a34dfff-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-c54b5cd7b-2mf46\" (UID: \"436fc62f-0a9b-46f8-8b63-a2ad9a34dfff\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-c54b5cd7b-2mf46" Mar 10 00:33:49 crc kubenswrapper[4906]: I0310 00:33:49.453936 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m42wn\" (UniqueName: \"kubernetes.io/projected/436fc62f-0a9b-46f8-8b63-a2ad9a34dfff-kube-api-access-m42wn\") pod \"default-cloud1-ceil-event-smartgateway-c54b5cd7b-2mf46\" (UID: \"436fc62f-0a9b-46f8-8b63-a2ad9a34dfff\") " 
pod="service-telemetry/default-cloud1-ceil-event-smartgateway-c54b5cd7b-2mf46" Mar 10 00:33:49 crc kubenswrapper[4906]: I0310 00:33:49.453996 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/436fc62f-0a9b-46f8-8b63-a2ad9a34dfff-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-c54b5cd7b-2mf46\" (UID: \"436fc62f-0a9b-46f8-8b63-a2ad9a34dfff\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-c54b5cd7b-2mf46" Mar 10 00:33:49 crc kubenswrapper[4906]: I0310 00:33:49.454032 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/436fc62f-0a9b-46f8-8b63-a2ad9a34dfff-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-c54b5cd7b-2mf46\" (UID: \"436fc62f-0a9b-46f8-8b63-a2ad9a34dfff\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-c54b5cd7b-2mf46" Mar 10 00:33:49 crc kubenswrapper[4906]: I0310 00:33:49.455514 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/436fc62f-0a9b-46f8-8b63-a2ad9a34dfff-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-c54b5cd7b-2mf46\" (UID: \"436fc62f-0a9b-46f8-8b63-a2ad9a34dfff\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-c54b5cd7b-2mf46" Mar 10 00:33:49 crc kubenswrapper[4906]: I0310 00:33:49.457173 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/436fc62f-0a9b-46f8-8b63-a2ad9a34dfff-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-c54b5cd7b-2mf46\" (UID: \"436fc62f-0a9b-46f8-8b63-a2ad9a34dfff\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-c54b5cd7b-2mf46" Mar 10 00:33:49 crc kubenswrapper[4906]: I0310 00:33:49.461546 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-certs\" 
(UniqueName: \"kubernetes.io/secret/436fc62f-0a9b-46f8-8b63-a2ad9a34dfff-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-c54b5cd7b-2mf46\" (UID: \"436fc62f-0a9b-46f8-8b63-a2ad9a34dfff\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-c54b5cd7b-2mf46" Mar 10 00:33:49 crc kubenswrapper[4906]: I0310 00:33:49.476025 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m42wn\" (UniqueName: \"kubernetes.io/projected/436fc62f-0a9b-46f8-8b63-a2ad9a34dfff-kube-api-access-m42wn\") pod \"default-cloud1-ceil-event-smartgateway-c54b5cd7b-2mf46\" (UID: \"436fc62f-0a9b-46f8-8b63-a2ad9a34dfff\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-c54b5cd7b-2mf46" Mar 10 00:33:49 crc kubenswrapper[4906]: I0310 00:33:49.512269 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-c54b5cd7b-2mf46" Mar 10 00:33:49 crc kubenswrapper[4906]: I0310 00:33:49.576284 4906 scope.go:117] "RemoveContainer" containerID="3d03901045dd3898d4d4472388a38a1a6380b91b22032509551213e74b4671a3" Mar 10 00:33:49 crc kubenswrapper[4906]: E0310 00:33:49.576502 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bxtw4_openshift-machine-config-operator(72d61d35-0a64-45a5-8df3-9c429727deba)\"" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" Mar 10 00:33:51 crc kubenswrapper[4906]: I0310 00:33:51.122116 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-6d5f89bf86-x79fr"] Mar 10 00:33:51 crc kubenswrapper[4906]: W0310 00:33:51.135981 4906 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc914b63b_284c_4693_b383_0583ea97faaf.slice/crio-0bcecb9011e9b462508eb2e3183f5b646d50a139410dd3d9e1f671e0f562f11c WatchSource:0}: Error finding container 0bcecb9011e9b462508eb2e3183f5b646d50a139410dd3d9e1f671e0f562f11c: Status 404 returned error can't find the container with id 0bcecb9011e9b462508eb2e3183f5b646d50a139410dd3d9e1f671e0f562f11c Mar 10 00:33:51 crc kubenswrapper[4906]: I0310 00:33:51.168806 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-c54b5cd7b-2mf46"] Mar 10 00:33:51 crc kubenswrapper[4906]: I0310 00:33:51.626337 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-6d5f89bf86-x79fr" event={"ID":"c914b63b-284c-4693-b383-0583ea97faaf","Type":"ContainerStarted","Data":"0bcecb9011e9b462508eb2e3183f5b646d50a139410dd3d9e1f671e0f562f11c"} Mar 10 00:33:51 crc kubenswrapper[4906]: I0310 00:33:51.627831 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-c54b5cd7b-2mf46" event={"ID":"436fc62f-0a9b-46f8-8b63-a2ad9a34dfff","Type":"ContainerStarted","Data":"217ea7d1643df0cfb9a7a189a5574cf50dc08ab443d82391b6ec3c4cf627ad22"} Mar 10 00:33:54 crc kubenswrapper[4906]: I0310 00:33:54.965683 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/prometheus-default-0" Mar 10 00:33:55 crc kubenswrapper[4906]: I0310 00:33:55.028398 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/prometheus-default-0" Mar 10 00:33:55 crc kubenswrapper[4906]: I0310 00:33:55.732619 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/prometheus-default-0" Mar 10 00:33:56 crc kubenswrapper[4906]: I0310 00:33:56.688838 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/alertmanager-default-0" event={"ID":"f531aabb-4f0a-49a6-95ed-f5e3c04ca148","Type":"ContainerStarted","Data":"e9d992ce4e124ac2a8e6b17bae4710ee9ade5bfce79e9385f5d4ba1b15a0c4ec"} Mar 10 00:33:56 crc kubenswrapper[4906]: I0310 00:33:56.690753 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-6d5f89bf86-x79fr" event={"ID":"c914b63b-284c-4693-b383-0583ea97faaf","Type":"ContainerStarted","Data":"236fefb5e3e18dbd81592cd28095d6890e00cc0ab6868e0d17a5244240188232"} Mar 10 00:33:56 crc kubenswrapper[4906]: I0310 00:33:56.691150 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-6d5f89bf86-x79fr" event={"ID":"c914b63b-284c-4693-b383-0583ea97faaf","Type":"ContainerStarted","Data":"08066d95d9100c3283f1e3fa939375fd446b5dcf7751a8887667c423e2d050ae"} Mar 10 00:33:56 crc kubenswrapper[4906]: I0310 00:33:56.693768 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-nvklp" event={"ID":"17b2c3a4-f715-4e49-baf6-ec674881557e","Type":"ContainerStarted","Data":"cc7463ea55e334672f358a33aa6ea3ea4f03c324f48572ae039440fdb4800311"} Mar 10 00:33:56 crc kubenswrapper[4906]: I0310 00:33:56.695871 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-c54b5cd7b-2mf46" event={"ID":"436fc62f-0a9b-46f8-8b63-a2ad9a34dfff","Type":"ContainerStarted","Data":"c9fba397679d423b4fcb57bb87481d032afc721039745ccfb42ae171c66d2eca"} Mar 10 00:33:56 crc kubenswrapper[4906]: I0310 00:33:56.695896 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-c54b5cd7b-2mf46" event={"ID":"436fc62f-0a9b-46f8-8b63-a2ad9a34dfff","Type":"ContainerStarted","Data":"dfba1818dbbe533e0fb93624d5f7833b1f05c705b76ba98ba5eb973334db8bcd"} Mar 10 00:33:56 crc kubenswrapper[4906]: I0310 
00:33:56.698491 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8m4z" event={"ID":"216238e4-5dbc-4cc6-838b-d8521159606f","Type":"ContainerStarted","Data":"13ba4013d5a725254851d839037b08a94be8417756afc5f1a815765d55b7470d"} Mar 10 00:33:56 crc kubenswrapper[4906]: I0310 00:33:56.698545 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8m4z" event={"ID":"216238e4-5dbc-4cc6-838b-d8521159606f","Type":"ContainerStarted","Data":"0ecaae6f84964d04b4b0ecf2c1ac53189971bb3753a4112646ecfe07bcf2f922"} Mar 10 00:33:56 crc kubenswrapper[4906]: I0310 00:33:56.701083 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-8z7rm" event={"ID":"6626cf94-749f-4f6e-8a4e-9c58410aaa46","Type":"ContainerStarted","Data":"f71572e533c5da7c16e7175a4ff89f296d5455705ad431b456a93910fe713902"} Mar 10 00:33:56 crc kubenswrapper[4906]: I0310 00:33:56.726521 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/alertmanager-default-0" podStartSLOduration=16.649233723000002 podStartE2EDuration="36.726500209s" podCreationTimestamp="2026-03-10 00:33:20 +0000 UTC" firstStartedPulling="2026-03-10 00:33:35.493315404 +0000 UTC m=+1641.641210516" lastFinishedPulling="2026-03-10 00:33:55.57058185 +0000 UTC m=+1661.718477002" observedRunningTime="2026-03-10 00:33:56.725142461 +0000 UTC m=+1662.873037603" watchObservedRunningTime="2026-03-10 00:33:56.726500209 +0000 UTC m=+1662.874395341" Mar 10 00:33:56 crc kubenswrapper[4906]: I0310 00:33:56.746673 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-c54b5cd7b-2mf46" podStartSLOduration=2.840295231 podStartE2EDuration="7.746655815s" podCreationTimestamp="2026-03-10 00:33:49 +0000 UTC" 
firstStartedPulling="2026-03-10 00:33:51.188777371 +0000 UTC m=+1657.336672483" lastFinishedPulling="2026-03-10 00:33:56.095137955 +0000 UTC m=+1662.243033067" observedRunningTime="2026-03-10 00:33:56.74648151 +0000 UTC m=+1662.894376622" watchObservedRunningTime="2026-03-10 00:33:56.746655815 +0000 UTC m=+1662.894550927" Mar 10 00:33:56 crc kubenswrapper[4906]: I0310 00:33:56.785327 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-coll-event-smartgateway-6d5f89bf86-x79fr" podStartSLOduration=4.873327851 podStartE2EDuration="9.785304151s" podCreationTimestamp="2026-03-10 00:33:47 +0000 UTC" firstStartedPulling="2026-03-10 00:33:51.141432972 +0000 UTC m=+1657.289328084" lastFinishedPulling="2026-03-10 00:33:56.053409272 +0000 UTC m=+1662.201304384" observedRunningTime="2026-03-10 00:33:56.776859163 +0000 UTC m=+1662.924754265" watchObservedRunningTime="2026-03-10 00:33:56.785304151 +0000 UTC m=+1662.933199263" Mar 10 00:33:56 crc kubenswrapper[4906]: I0310 00:33:56.831466 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8m4z" podStartSLOduration=5.015543903 podStartE2EDuration="16.831449597s" podCreationTimestamp="2026-03-10 00:33:40 +0000 UTC" firstStartedPulling="2026-03-10 00:33:44.176218047 +0000 UTC m=+1650.324113159" lastFinishedPulling="2026-03-10 00:33:55.992123741 +0000 UTC m=+1662.140018853" observedRunningTime="2026-03-10 00:33:56.809961353 +0000 UTC m=+1662.957856465" watchObservedRunningTime="2026-03-10 00:33:56.831449597 +0000 UTC m=+1662.979344709" Mar 10 00:33:56 crc kubenswrapper[4906]: I0310 00:33:56.852950 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-nvklp" podStartSLOduration=7.346122773 podStartE2EDuration="22.85293065s" podCreationTimestamp="2026-03-10 00:33:34 +0000 UTC" 
firstStartedPulling="2026-03-10 00:33:40.189022411 +0000 UTC m=+1646.336917523" lastFinishedPulling="2026-03-10 00:33:55.695830278 +0000 UTC m=+1661.843725400" observedRunningTime="2026-03-10 00:33:56.826904969 +0000 UTC m=+1662.974800081" watchObservedRunningTime="2026-03-10 00:33:56.85293065 +0000 UTC m=+1663.000825762" Mar 10 00:33:56 crc kubenswrapper[4906]: I0310 00:33:56.853468 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-8z7rm" podStartSLOduration=7.621450733 podStartE2EDuration="19.853463305s" podCreationTimestamp="2026-03-10 00:33:37 +0000 UTC" firstStartedPulling="2026-03-10 00:33:43.436256082 +0000 UTC m=+1649.584151194" lastFinishedPulling="2026-03-10 00:33:55.668268644 +0000 UTC m=+1661.816163766" observedRunningTime="2026-03-10 00:33:56.845625565 +0000 UTC m=+1662.993520667" watchObservedRunningTime="2026-03-10 00:33:56.853463305 +0000 UTC m=+1663.001358417" Mar 10 00:34:00 crc kubenswrapper[4906]: I0310 00:34:00.134451 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551714-l284f"] Mar 10 00:34:00 crc kubenswrapper[4906]: I0310 00:34:00.135726 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551714-l284f" Mar 10 00:34:00 crc kubenswrapper[4906]: I0310 00:34:00.141254 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 00:34:00 crc kubenswrapper[4906]: I0310 00:34:00.143386 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ktdr6" Mar 10 00:34:00 crc kubenswrapper[4906]: I0310 00:34:00.152323 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 00:34:00 crc kubenswrapper[4906]: I0310 00:34:00.177436 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551714-l284f"] Mar 10 00:34:00 crc kubenswrapper[4906]: I0310 00:34:00.230736 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpzp6\" (UniqueName: \"kubernetes.io/projected/ce0b1165-845d-47b5-b6f8-3d2519066da0-kube-api-access-cpzp6\") pod \"auto-csr-approver-29551714-l284f\" (UID: \"ce0b1165-845d-47b5-b6f8-3d2519066da0\") " pod="openshift-infra/auto-csr-approver-29551714-l284f" Mar 10 00:34:00 crc kubenswrapper[4906]: I0310 00:34:00.331814 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpzp6\" (UniqueName: \"kubernetes.io/projected/ce0b1165-845d-47b5-b6f8-3d2519066da0-kube-api-access-cpzp6\") pod \"auto-csr-approver-29551714-l284f\" (UID: \"ce0b1165-845d-47b5-b6f8-3d2519066da0\") " pod="openshift-infra/auto-csr-approver-29551714-l284f" Mar 10 00:34:00 crc kubenswrapper[4906]: I0310 00:34:00.358396 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpzp6\" (UniqueName: \"kubernetes.io/projected/ce0b1165-845d-47b5-b6f8-3d2519066da0-kube-api-access-cpzp6\") pod \"auto-csr-approver-29551714-l284f\" (UID: \"ce0b1165-845d-47b5-b6f8-3d2519066da0\") " 
pod="openshift-infra/auto-csr-approver-29551714-l284f" Mar 10 00:34:00 crc kubenswrapper[4906]: I0310 00:34:00.451764 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551714-l284f" Mar 10 00:34:00 crc kubenswrapper[4906]: I0310 00:34:00.577398 4906 scope.go:117] "RemoveContainer" containerID="3d03901045dd3898d4d4472388a38a1a6380b91b22032509551213e74b4671a3" Mar 10 00:34:00 crc kubenswrapper[4906]: E0310 00:34:00.577835 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bxtw4_openshift-machine-config-operator(72d61d35-0a64-45a5-8df3-9c429727deba)\"" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" Mar 10 00:34:00 crc kubenswrapper[4906]: I0310 00:34:00.796077 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551714-l284f"] Mar 10 00:34:01 crc kubenswrapper[4906]: I0310 00:34:01.741215 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551714-l284f" event={"ID":"ce0b1165-845d-47b5-b6f8-3d2519066da0","Type":"ContainerStarted","Data":"7ac4d0ef97e55b48782d5e5e6f1a4a6bf019d668a63b55e69795080f20a8f4cb"} Mar 10 00:34:02 crc kubenswrapper[4906]: I0310 00:34:02.524502 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-jg6vm"] Mar 10 00:34:02 crc kubenswrapper[4906]: I0310 00:34:02.524992 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/default-interconnect-68864d46cb-jg6vm" podUID="deec19d6-17bf-41d0-bbe9-5578dce6e7be" containerName="default-interconnect" containerID="cri-o://d1081e6a240b63f416b332e706337c910c94a358dd3f5ffea8c02e3259b90754" gracePeriod=30 Mar 10 
00:34:02 crc kubenswrapper[4906]: I0310 00:34:02.753685 4906 generic.go:334] "Generic (PLEG): container finished" podID="deec19d6-17bf-41d0-bbe9-5578dce6e7be" containerID="d1081e6a240b63f416b332e706337c910c94a358dd3f5ffea8c02e3259b90754" exitCode=0 Mar 10 00:34:02 crc kubenswrapper[4906]: I0310 00:34:02.753766 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-jg6vm" event={"ID":"deec19d6-17bf-41d0-bbe9-5578dce6e7be","Type":"ContainerDied","Data":"d1081e6a240b63f416b332e706337c910c94a358dd3f5ffea8c02e3259b90754"} Mar 10 00:34:02 crc kubenswrapper[4906]: I0310 00:34:02.759420 4906 generic.go:334] "Generic (PLEG): container finished" podID="6626cf94-749f-4f6e-8a4e-9c58410aaa46" containerID="ed71d44ce50559a6e814c1b0e2c04008591b34cd1e35e0a5e65636549414665f" exitCode=0 Mar 10 00:34:02 crc kubenswrapper[4906]: I0310 00:34:02.759492 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-8z7rm" event={"ID":"6626cf94-749f-4f6e-8a4e-9c58410aaa46","Type":"ContainerDied","Data":"ed71d44ce50559a6e814c1b0e2c04008591b34cd1e35e0a5e65636549414665f"} Mar 10 00:34:02 crc kubenswrapper[4906]: I0310 00:34:02.760076 4906 scope.go:117] "RemoveContainer" containerID="ed71d44ce50559a6e814c1b0e2c04008591b34cd1e35e0a5e65636549414665f" Mar 10 00:34:02 crc kubenswrapper[4906]: I0310 00:34:02.763288 4906 generic.go:334] "Generic (PLEG): container finished" podID="ce0b1165-845d-47b5-b6f8-3d2519066da0" containerID="2e98eb0d7fb03e6c6ca674cc935091d17b62749141537c4a23c135853bc606d8" exitCode=0 Mar 10 00:34:02 crc kubenswrapper[4906]: I0310 00:34:02.763377 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551714-l284f" event={"ID":"ce0b1165-845d-47b5-b6f8-3d2519066da0","Type":"ContainerDied","Data":"2e98eb0d7fb03e6c6ca674cc935091d17b62749141537c4a23c135853bc606d8"} Mar 10 00:34:02 crc kubenswrapper[4906]: I0310 
00:34:02.769359 4906 generic.go:334] "Generic (PLEG): container finished" podID="17b2c3a4-f715-4e49-baf6-ec674881557e" containerID="e20f43e0fa1ccc000a74f94283064218801c3f6c7b961d924fc066dab6b48a6f" exitCode=0 Mar 10 00:34:02 crc kubenswrapper[4906]: I0310 00:34:02.769405 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-nvklp" event={"ID":"17b2c3a4-f715-4e49-baf6-ec674881557e","Type":"ContainerDied","Data":"e20f43e0fa1ccc000a74f94283064218801c3f6c7b961d924fc066dab6b48a6f"} Mar 10 00:34:02 crc kubenswrapper[4906]: I0310 00:34:02.769791 4906 scope.go:117] "RemoveContainer" containerID="e20f43e0fa1ccc000a74f94283064218801c3f6c7b961d924fc066dab6b48a6f" Mar 10 00:34:03 crc kubenswrapper[4906]: I0310 00:34:03.096300 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-jg6vm" Mar 10 00:34:03 crc kubenswrapper[4906]: I0310 00:34:03.171829 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/deec19d6-17bf-41d0-bbe9-5578dce6e7be-sasl-config\") pod \"deec19d6-17bf-41d0-bbe9-5578dce6e7be\" (UID: \"deec19d6-17bf-41d0-bbe9-5578dce6e7be\") " Mar 10 00:34:03 crc kubenswrapper[4906]: I0310 00:34:03.171931 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/deec19d6-17bf-41d0-bbe9-5578dce6e7be-default-interconnect-inter-router-credentials\") pod \"deec19d6-17bf-41d0-bbe9-5578dce6e7be\" (UID: \"deec19d6-17bf-41d0-bbe9-5578dce6e7be\") " Mar 10 00:34:03 crc kubenswrapper[4906]: I0310 00:34:03.171959 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/deec19d6-17bf-41d0-bbe9-5578dce6e7be-default-interconnect-openstack-ca\") 
pod \"deec19d6-17bf-41d0-bbe9-5578dce6e7be\" (UID: \"deec19d6-17bf-41d0-bbe9-5578dce6e7be\") " Mar 10 00:34:03 crc kubenswrapper[4906]: I0310 00:34:03.172005 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/deec19d6-17bf-41d0-bbe9-5578dce6e7be-sasl-users\") pod \"deec19d6-17bf-41d0-bbe9-5578dce6e7be\" (UID: \"deec19d6-17bf-41d0-bbe9-5578dce6e7be\") " Mar 10 00:34:03 crc kubenswrapper[4906]: I0310 00:34:03.172067 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/deec19d6-17bf-41d0-bbe9-5578dce6e7be-default-interconnect-openstack-credentials\") pod \"deec19d6-17bf-41d0-bbe9-5578dce6e7be\" (UID: \"deec19d6-17bf-41d0-bbe9-5578dce6e7be\") " Mar 10 00:34:03 crc kubenswrapper[4906]: I0310 00:34:03.172094 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/deec19d6-17bf-41d0-bbe9-5578dce6e7be-default-interconnect-inter-router-ca\") pod \"deec19d6-17bf-41d0-bbe9-5578dce6e7be\" (UID: \"deec19d6-17bf-41d0-bbe9-5578dce6e7be\") " Mar 10 00:34:03 crc kubenswrapper[4906]: I0310 00:34:03.172159 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqfb7\" (UniqueName: \"kubernetes.io/projected/deec19d6-17bf-41d0-bbe9-5578dce6e7be-kube-api-access-wqfb7\") pod \"deec19d6-17bf-41d0-bbe9-5578dce6e7be\" (UID: \"deec19d6-17bf-41d0-bbe9-5578dce6e7be\") " Mar 10 00:34:03 crc kubenswrapper[4906]: I0310 00:34:03.173675 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/deec19d6-17bf-41d0-bbe9-5578dce6e7be-sasl-config" (OuterVolumeSpecName: "sasl-config") pod "deec19d6-17bf-41d0-bbe9-5578dce6e7be" (UID: "deec19d6-17bf-41d0-bbe9-5578dce6e7be"). InnerVolumeSpecName "sasl-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:34:03 crc kubenswrapper[4906]: I0310 00:34:03.178801 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/deec19d6-17bf-41d0-bbe9-5578dce6e7be-default-interconnect-openstack-ca" (OuterVolumeSpecName: "default-interconnect-openstack-ca") pod "deec19d6-17bf-41d0-bbe9-5578dce6e7be" (UID: "deec19d6-17bf-41d0-bbe9-5578dce6e7be"). InnerVolumeSpecName "default-interconnect-openstack-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:34:03 crc kubenswrapper[4906]: I0310 00:34:03.180766 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/deec19d6-17bf-41d0-bbe9-5578dce6e7be-default-interconnect-inter-router-ca" (OuterVolumeSpecName: "default-interconnect-inter-router-ca") pod "deec19d6-17bf-41d0-bbe9-5578dce6e7be" (UID: "deec19d6-17bf-41d0-bbe9-5578dce6e7be"). InnerVolumeSpecName "default-interconnect-inter-router-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:34:03 crc kubenswrapper[4906]: I0310 00:34:03.180937 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/deec19d6-17bf-41d0-bbe9-5578dce6e7be-sasl-users" (OuterVolumeSpecName: "sasl-users") pod "deec19d6-17bf-41d0-bbe9-5578dce6e7be" (UID: "deec19d6-17bf-41d0-bbe9-5578dce6e7be"). InnerVolumeSpecName "sasl-users". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:34:03 crc kubenswrapper[4906]: I0310 00:34:03.181274 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/deec19d6-17bf-41d0-bbe9-5578dce6e7be-default-interconnect-openstack-credentials" (OuterVolumeSpecName: "default-interconnect-openstack-credentials") pod "deec19d6-17bf-41d0-bbe9-5578dce6e7be" (UID: "deec19d6-17bf-41d0-bbe9-5578dce6e7be"). InnerVolumeSpecName "default-interconnect-openstack-credentials". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:34:03 crc kubenswrapper[4906]: I0310 00:34:03.186982 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/deec19d6-17bf-41d0-bbe9-5578dce6e7be-kube-api-access-wqfb7" (OuterVolumeSpecName: "kube-api-access-wqfb7") pod "deec19d6-17bf-41d0-bbe9-5578dce6e7be" (UID: "deec19d6-17bf-41d0-bbe9-5578dce6e7be"). InnerVolumeSpecName "kube-api-access-wqfb7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:34:03 crc kubenswrapper[4906]: I0310 00:34:03.187001 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/deec19d6-17bf-41d0-bbe9-5578dce6e7be-default-interconnect-inter-router-credentials" (OuterVolumeSpecName: "default-interconnect-inter-router-credentials") pod "deec19d6-17bf-41d0-bbe9-5578dce6e7be" (UID: "deec19d6-17bf-41d0-bbe9-5578dce6e7be"). InnerVolumeSpecName "default-interconnect-inter-router-credentials". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 00:34:03 crc kubenswrapper[4906]: I0310 00:34:03.274250 4906 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/deec19d6-17bf-41d0-bbe9-5578dce6e7be-default-interconnect-inter-router-credentials\") on node \"crc\" DevicePath \"\"" Mar 10 00:34:03 crc kubenswrapper[4906]: I0310 00:34:03.274295 4906 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/deec19d6-17bf-41d0-bbe9-5578dce6e7be-default-interconnect-openstack-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:34:03 crc kubenswrapper[4906]: I0310 00:34:03.274310 4906 reconciler_common.go:293] "Volume detached for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/deec19d6-17bf-41d0-bbe9-5578dce6e7be-sasl-users\") on node \"crc\" DevicePath \"\"" Mar 10 00:34:03 crc kubenswrapper[4906]: I0310 00:34:03.274325 4906 
reconciler_common.go:293] "Volume detached for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/deec19d6-17bf-41d0-bbe9-5578dce6e7be-default-interconnect-openstack-credentials\") on node \"crc\" DevicePath \"\"" Mar 10 00:34:03 crc kubenswrapper[4906]: I0310 00:34:03.274339 4906 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/deec19d6-17bf-41d0-bbe9-5578dce6e7be-default-interconnect-inter-router-ca\") on node \"crc\" DevicePath \"\"" Mar 10 00:34:03 crc kubenswrapper[4906]: I0310 00:34:03.274352 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqfb7\" (UniqueName: \"kubernetes.io/projected/deec19d6-17bf-41d0-bbe9-5578dce6e7be-kube-api-access-wqfb7\") on node \"crc\" DevicePath \"\"" Mar 10 00:34:03 crc kubenswrapper[4906]: I0310 00:34:03.274367 4906 reconciler_common.go:293] "Volume detached for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/deec19d6-17bf-41d0-bbe9-5578dce6e7be-sasl-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:34:03 crc kubenswrapper[4906]: I0310 00:34:03.784609 4906 generic.go:334] "Generic (PLEG): container finished" podID="216238e4-5dbc-4cc6-838b-d8521159606f" containerID="0ecaae6f84964d04b4b0ecf2c1ac53189971bb3753a4112646ecfe07bcf2f922" exitCode=0 Mar 10 00:34:03 crc kubenswrapper[4906]: I0310 00:34:03.784730 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8m4z" event={"ID":"216238e4-5dbc-4cc6-838b-d8521159606f","Type":"ContainerDied","Data":"0ecaae6f84964d04b4b0ecf2c1ac53189971bb3753a4112646ecfe07bcf2f922"} Mar 10 00:34:03 crc kubenswrapper[4906]: I0310 00:34:03.785418 4906 scope.go:117] "RemoveContainer" containerID="0ecaae6f84964d04b4b0ecf2c1ac53189971bb3753a4112646ecfe07bcf2f922" Mar 10 00:34:03 crc kubenswrapper[4906]: I0310 00:34:03.792502 4906 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-8z7rm" event={"ID":"6626cf94-749f-4f6e-8a4e-9c58410aaa46","Type":"ContainerStarted","Data":"3148298c908d09353bd22500a067e51fad8cd5aa743467da9c464c0f8232cc5b"} Mar 10 00:34:03 crc kubenswrapper[4906]: I0310 00:34:03.795210 4906 generic.go:334] "Generic (PLEG): container finished" podID="c914b63b-284c-4693-b383-0583ea97faaf" containerID="08066d95d9100c3283f1e3fa939375fd446b5dcf7751a8887667c423e2d050ae" exitCode=0 Mar 10 00:34:03 crc kubenswrapper[4906]: I0310 00:34:03.795263 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-6d5f89bf86-x79fr" event={"ID":"c914b63b-284c-4693-b383-0583ea97faaf","Type":"ContainerDied","Data":"08066d95d9100c3283f1e3fa939375fd446b5dcf7751a8887667c423e2d050ae"} Mar 10 00:34:03 crc kubenswrapper[4906]: I0310 00:34:03.795528 4906 scope.go:117] "RemoveContainer" containerID="08066d95d9100c3283f1e3fa939375fd446b5dcf7751a8887667c423e2d050ae" Mar 10 00:34:03 crc kubenswrapper[4906]: I0310 00:34:03.798178 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-nvklp" event={"ID":"17b2c3a4-f715-4e49-baf6-ec674881557e","Type":"ContainerStarted","Data":"a2cc847639ad59af241330f07e320d1ec5f628e8b770baea7657be35e531e36c"} Mar 10 00:34:03 crc kubenswrapper[4906]: I0310 00:34:03.801812 4906 generic.go:334] "Generic (PLEG): container finished" podID="436fc62f-0a9b-46f8-8b63-a2ad9a34dfff" containerID="dfba1818dbbe533e0fb93624d5f7833b1f05c705b76ba98ba5eb973334db8bcd" exitCode=0 Mar 10 00:34:03 crc kubenswrapper[4906]: I0310 00:34:03.801848 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-c54b5cd7b-2mf46" event={"ID":"436fc62f-0a9b-46f8-8b63-a2ad9a34dfff","Type":"ContainerDied","Data":"dfba1818dbbe533e0fb93624d5f7833b1f05c705b76ba98ba5eb973334db8bcd"} Mar 10 00:34:03 crc 
kubenswrapper[4906]: I0310 00:34:03.802365 4906 scope.go:117] "RemoveContainer" containerID="dfba1818dbbe533e0fb93624d5f7833b1f05c705b76ba98ba5eb973334db8bcd" Mar 10 00:34:03 crc kubenswrapper[4906]: I0310 00:34:03.808136 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-jg6vm" event={"ID":"deec19d6-17bf-41d0-bbe9-5578dce6e7be","Type":"ContainerDied","Data":"023856eec11a4bff7a2493ccf96f4af835f9e2a6ba25c265fe47ef5c03709265"} Mar 10 00:34:03 crc kubenswrapper[4906]: I0310 00:34:03.808156 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-jg6vm" Mar 10 00:34:03 crc kubenswrapper[4906]: I0310 00:34:03.808187 4906 scope.go:117] "RemoveContainer" containerID="d1081e6a240b63f416b332e706337c910c94a358dd3f5ffea8c02e3259b90754" Mar 10 00:34:03 crc kubenswrapper[4906]: I0310 00:34:03.930973 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-jg6vm"] Mar 10 00:34:03 crc kubenswrapper[4906]: I0310 00:34:03.936215 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-jg6vm"] Mar 10 00:34:04 crc kubenswrapper[4906]: I0310 00:34:04.083309 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-g96jw"] Mar 10 00:34:04 crc kubenswrapper[4906]: E0310 00:34:04.083561 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deec19d6-17bf-41d0-bbe9-5578dce6e7be" containerName="default-interconnect" Mar 10 00:34:04 crc kubenswrapper[4906]: I0310 00:34:04.083572 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="deec19d6-17bf-41d0-bbe9-5578dce6e7be" containerName="default-interconnect" Mar 10 00:34:04 crc kubenswrapper[4906]: I0310 00:34:04.083741 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="deec19d6-17bf-41d0-bbe9-5578dce6e7be" 
containerName="default-interconnect" Mar 10 00:34:04 crc kubenswrapper[4906]: I0310 00:34:04.084236 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-g96jw" Mar 10 00:34:04 crc kubenswrapper[4906]: I0310 00:34:04.085679 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-credentials" Mar 10 00:34:04 crc kubenswrapper[4906]: I0310 00:34:04.087819 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-users" Mar 10 00:34:04 crc kubenswrapper[4906]: I0310 00:34:04.088006 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-credentials" Mar 10 00:34:04 crc kubenswrapper[4906]: I0310 00:34:04.088186 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-ca" Mar 10 00:34:04 crc kubenswrapper[4906]: I0310 00:34:04.088314 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-ca" Mar 10 00:34:04 crc kubenswrapper[4906]: I0310 00:34:04.091128 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-interconnect-sasl-config" Mar 10 00:34:04 crc kubenswrapper[4906]: I0310 00:34:04.092677 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-g96jw"] Mar 10 00:34:04 crc kubenswrapper[4906]: I0310 00:34:04.094243 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-dockercfg-rh2fb" Mar 10 00:34:04 crc kubenswrapper[4906]: I0310 00:34:04.201849 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-config\" (UniqueName: 
\"kubernetes.io/configmap/16322ed1-3150-4dd7-a85b-ae6b330a040d-sasl-config\") pod \"default-interconnect-68864d46cb-g96jw\" (UID: \"16322ed1-3150-4dd7-a85b-ae6b330a040d\") " pod="service-telemetry/default-interconnect-68864d46cb-g96jw" Mar 10 00:34:04 crc kubenswrapper[4906]: I0310 00:34:04.202173 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/16322ed1-3150-4dd7-a85b-ae6b330a040d-sasl-users\") pod \"default-interconnect-68864d46cb-g96jw\" (UID: \"16322ed1-3150-4dd7-a85b-ae6b330a040d\") " pod="service-telemetry/default-interconnect-68864d46cb-g96jw" Mar 10 00:34:04 crc kubenswrapper[4906]: I0310 00:34:04.202204 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/16322ed1-3150-4dd7-a85b-ae6b330a040d-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-g96jw\" (UID: \"16322ed1-3150-4dd7-a85b-ae6b330a040d\") " pod="service-telemetry/default-interconnect-68864d46cb-g96jw" Mar 10 00:34:04 crc kubenswrapper[4906]: I0310 00:34:04.202226 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/16322ed1-3150-4dd7-a85b-ae6b330a040d-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-g96jw\" (UID: \"16322ed1-3150-4dd7-a85b-ae6b330a040d\") " pod="service-telemetry/default-interconnect-68864d46cb-g96jw" Mar 10 00:34:04 crc kubenswrapper[4906]: I0310 00:34:04.202262 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/16322ed1-3150-4dd7-a85b-ae6b330a040d-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-g96jw\" (UID: 
\"16322ed1-3150-4dd7-a85b-ae6b330a040d\") " pod="service-telemetry/default-interconnect-68864d46cb-g96jw" Mar 10 00:34:04 crc kubenswrapper[4906]: I0310 00:34:04.202283 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cqfz\" (UniqueName: \"kubernetes.io/projected/16322ed1-3150-4dd7-a85b-ae6b330a040d-kube-api-access-4cqfz\") pod \"default-interconnect-68864d46cb-g96jw\" (UID: \"16322ed1-3150-4dd7-a85b-ae6b330a040d\") " pod="service-telemetry/default-interconnect-68864d46cb-g96jw" Mar 10 00:34:04 crc kubenswrapper[4906]: I0310 00:34:04.202302 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/16322ed1-3150-4dd7-a85b-ae6b330a040d-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-g96jw\" (UID: \"16322ed1-3150-4dd7-a85b-ae6b330a040d\") " pod="service-telemetry/default-interconnect-68864d46cb-g96jw" Mar 10 00:34:04 crc kubenswrapper[4906]: I0310 00:34:04.206414 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551714-l284f" Mar 10 00:34:04 crc kubenswrapper[4906]: I0310 00:34:04.304278 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpzp6\" (UniqueName: \"kubernetes.io/projected/ce0b1165-845d-47b5-b6f8-3d2519066da0-kube-api-access-cpzp6\") pod \"ce0b1165-845d-47b5-b6f8-3d2519066da0\" (UID: \"ce0b1165-845d-47b5-b6f8-3d2519066da0\") " Mar 10 00:34:04 crc kubenswrapper[4906]: I0310 00:34:04.304478 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/16322ed1-3150-4dd7-a85b-ae6b330a040d-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-g96jw\" (UID: \"16322ed1-3150-4dd7-a85b-ae6b330a040d\") " pod="service-telemetry/default-interconnect-68864d46cb-g96jw" Mar 10 00:34:04 crc kubenswrapper[4906]: I0310 00:34:04.304544 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/16322ed1-3150-4dd7-a85b-ae6b330a040d-sasl-config\") pod \"default-interconnect-68864d46cb-g96jw\" (UID: \"16322ed1-3150-4dd7-a85b-ae6b330a040d\") " pod="service-telemetry/default-interconnect-68864d46cb-g96jw" Mar 10 00:34:04 crc kubenswrapper[4906]: I0310 00:34:04.304579 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/16322ed1-3150-4dd7-a85b-ae6b330a040d-sasl-users\") pod \"default-interconnect-68864d46cb-g96jw\" (UID: \"16322ed1-3150-4dd7-a85b-ae6b330a040d\") " pod="service-telemetry/default-interconnect-68864d46cb-g96jw" Mar 10 00:34:04 crc kubenswrapper[4906]: I0310 00:34:04.304603 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: 
\"kubernetes.io/secret/16322ed1-3150-4dd7-a85b-ae6b330a040d-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-g96jw\" (UID: \"16322ed1-3150-4dd7-a85b-ae6b330a040d\") " pod="service-telemetry/default-interconnect-68864d46cb-g96jw" Mar 10 00:34:04 crc kubenswrapper[4906]: I0310 00:34:04.304620 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/16322ed1-3150-4dd7-a85b-ae6b330a040d-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-g96jw\" (UID: \"16322ed1-3150-4dd7-a85b-ae6b330a040d\") " pod="service-telemetry/default-interconnect-68864d46cb-g96jw" Mar 10 00:34:04 crc kubenswrapper[4906]: I0310 00:34:04.304675 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/16322ed1-3150-4dd7-a85b-ae6b330a040d-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-g96jw\" (UID: \"16322ed1-3150-4dd7-a85b-ae6b330a040d\") " pod="service-telemetry/default-interconnect-68864d46cb-g96jw" Mar 10 00:34:04 crc kubenswrapper[4906]: I0310 00:34:04.304699 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cqfz\" (UniqueName: \"kubernetes.io/projected/16322ed1-3150-4dd7-a85b-ae6b330a040d-kube-api-access-4cqfz\") pod \"default-interconnect-68864d46cb-g96jw\" (UID: \"16322ed1-3150-4dd7-a85b-ae6b330a040d\") " pod="service-telemetry/default-interconnect-68864d46cb-g96jw" Mar 10 00:34:04 crc kubenswrapper[4906]: I0310 00:34:04.305566 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/16322ed1-3150-4dd7-a85b-ae6b330a040d-sasl-config\") pod \"default-interconnect-68864d46cb-g96jw\" (UID: \"16322ed1-3150-4dd7-a85b-ae6b330a040d\") " 
pod="service-telemetry/default-interconnect-68864d46cb-g96jw" Mar 10 00:34:04 crc kubenswrapper[4906]: I0310 00:34:04.310547 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/16322ed1-3150-4dd7-a85b-ae6b330a040d-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-g96jw\" (UID: \"16322ed1-3150-4dd7-a85b-ae6b330a040d\") " pod="service-telemetry/default-interconnect-68864d46cb-g96jw" Mar 10 00:34:04 crc kubenswrapper[4906]: I0310 00:34:04.310616 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/16322ed1-3150-4dd7-a85b-ae6b330a040d-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-g96jw\" (UID: \"16322ed1-3150-4dd7-a85b-ae6b330a040d\") " pod="service-telemetry/default-interconnect-68864d46cb-g96jw" Mar 10 00:34:04 crc kubenswrapper[4906]: I0310 00:34:04.310624 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce0b1165-845d-47b5-b6f8-3d2519066da0-kube-api-access-cpzp6" (OuterVolumeSpecName: "kube-api-access-cpzp6") pod "ce0b1165-845d-47b5-b6f8-3d2519066da0" (UID: "ce0b1165-845d-47b5-b6f8-3d2519066da0"). InnerVolumeSpecName "kube-api-access-cpzp6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:34:04 crc kubenswrapper[4906]: I0310 00:34:04.310721 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/16322ed1-3150-4dd7-a85b-ae6b330a040d-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-g96jw\" (UID: \"16322ed1-3150-4dd7-a85b-ae6b330a040d\") " pod="service-telemetry/default-interconnect-68864d46cb-g96jw" Mar 10 00:34:04 crc kubenswrapper[4906]: I0310 00:34:04.313247 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/16322ed1-3150-4dd7-a85b-ae6b330a040d-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-g96jw\" (UID: \"16322ed1-3150-4dd7-a85b-ae6b330a040d\") " pod="service-telemetry/default-interconnect-68864d46cb-g96jw" Mar 10 00:34:04 crc kubenswrapper[4906]: I0310 00:34:04.324313 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cqfz\" (UniqueName: \"kubernetes.io/projected/16322ed1-3150-4dd7-a85b-ae6b330a040d-kube-api-access-4cqfz\") pod \"default-interconnect-68864d46cb-g96jw\" (UID: \"16322ed1-3150-4dd7-a85b-ae6b330a040d\") " pod="service-telemetry/default-interconnect-68864d46cb-g96jw" Mar 10 00:34:04 crc kubenswrapper[4906]: I0310 00:34:04.328587 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/16322ed1-3150-4dd7-a85b-ae6b330a040d-sasl-users\") pod \"default-interconnect-68864d46cb-g96jw\" (UID: \"16322ed1-3150-4dd7-a85b-ae6b330a040d\") " pod="service-telemetry/default-interconnect-68864d46cb-g96jw" Mar 10 00:34:04 crc kubenswrapper[4906]: I0310 00:34:04.406108 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpzp6\" (UniqueName: 
\"kubernetes.io/projected/ce0b1165-845d-47b5-b6f8-3d2519066da0-kube-api-access-cpzp6\") on node \"crc\" DevicePath \"\"" Mar 10 00:34:04 crc kubenswrapper[4906]: I0310 00:34:04.497220 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-g96jw" Mar 10 00:34:04 crc kubenswrapper[4906]: I0310 00:34:04.593975 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="deec19d6-17bf-41d0-bbe9-5578dce6e7be" path="/var/lib/kubelet/pods/deec19d6-17bf-41d0-bbe9-5578dce6e7be/volumes" Mar 10 00:34:04 crc kubenswrapper[4906]: I0310 00:34:04.816134 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-c54b5cd7b-2mf46" event={"ID":"436fc62f-0a9b-46f8-8b63-a2ad9a34dfff","Type":"ContainerStarted","Data":"2a66916abfae909e51bf560545b024a106e8a9743c70bb1fc88d81e85b124132"} Mar 10 00:34:04 crc kubenswrapper[4906]: I0310 00:34:04.819651 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8m4z" event={"ID":"216238e4-5dbc-4cc6-838b-d8521159606f","Type":"ContainerStarted","Data":"38e190a32e81e8b4765abf5232592987b3549141672a5acc2d55fb3820508816"} Mar 10 00:34:04 crc kubenswrapper[4906]: I0310 00:34:04.822253 4906 generic.go:334] "Generic (PLEG): container finished" podID="6626cf94-749f-4f6e-8a4e-9c58410aaa46" containerID="3148298c908d09353bd22500a067e51fad8cd5aa743467da9c464c0f8232cc5b" exitCode=0 Mar 10 00:34:04 crc kubenswrapper[4906]: I0310 00:34:04.822315 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-8z7rm" event={"ID":"6626cf94-749f-4f6e-8a4e-9c58410aaa46","Type":"ContainerDied","Data":"3148298c908d09353bd22500a067e51fad8cd5aa743467da9c464c0f8232cc5b"} Mar 10 00:34:04 crc kubenswrapper[4906]: I0310 00:34:04.822366 4906 scope.go:117] "RemoveContainer" 
containerID="ed71d44ce50559a6e814c1b0e2c04008591b34cd1e35e0a5e65636549414665f" Mar 10 00:34:04 crc kubenswrapper[4906]: I0310 00:34:04.822901 4906 scope.go:117] "RemoveContainer" containerID="3148298c908d09353bd22500a067e51fad8cd5aa743467da9c464c0f8232cc5b" Mar 10 00:34:04 crc kubenswrapper[4906]: E0310 00:34:04.823102 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-ceil-meter-smartgateway-57948895dc-8z7rm_service-telemetry(6626cf94-749f-4f6e-8a4e-9c58410aaa46)\"" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-8z7rm" podUID="6626cf94-749f-4f6e-8a4e-9c58410aaa46" Mar 10 00:34:04 crc kubenswrapper[4906]: I0310 00:34:04.823774 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551714-l284f" event={"ID":"ce0b1165-845d-47b5-b6f8-3d2519066da0","Type":"ContainerDied","Data":"7ac4d0ef97e55b48782d5e5e6f1a4a6bf019d668a63b55e69795080f20a8f4cb"} Mar 10 00:34:04 crc kubenswrapper[4906]: I0310 00:34:04.823797 4906 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ac4d0ef97e55b48782d5e5e6f1a4a6bf019d668a63b55e69795080f20a8f4cb" Mar 10 00:34:04 crc kubenswrapper[4906]: I0310 00:34:04.823950 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551714-l284f" Mar 10 00:34:04 crc kubenswrapper[4906]: I0310 00:34:04.826686 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-6d5f89bf86-x79fr" event={"ID":"c914b63b-284c-4693-b383-0583ea97faaf","Type":"ContainerStarted","Data":"6e7b7a0249dd95dca28d22a98d2d69bca8095a99b5a69cb4f95cd7094d90a841"} Mar 10 00:34:04 crc kubenswrapper[4906]: I0310 00:34:04.829622 4906 generic.go:334] "Generic (PLEG): container finished" podID="17b2c3a4-f715-4e49-baf6-ec674881557e" containerID="a2cc847639ad59af241330f07e320d1ec5f628e8b770baea7657be35e531e36c" exitCode=0 Mar 10 00:34:04 crc kubenswrapper[4906]: I0310 00:34:04.829706 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-nvklp" event={"ID":"17b2c3a4-f715-4e49-baf6-ec674881557e","Type":"ContainerDied","Data":"a2cc847639ad59af241330f07e320d1ec5f628e8b770baea7657be35e531e36c"} Mar 10 00:34:04 crc kubenswrapper[4906]: I0310 00:34:04.830755 4906 scope.go:117] "RemoveContainer" containerID="a2cc847639ad59af241330f07e320d1ec5f628e8b770baea7657be35e531e36c" Mar 10 00:34:04 crc kubenswrapper[4906]: E0310 00:34:04.831214 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-coll-meter-smartgateway-7cd87f9766-nvklp_service-telemetry(17b2c3a4-f715-4e49-baf6-ec674881557e)\"" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-nvklp" podUID="17b2c3a4-f715-4e49-baf6-ec674881557e" Mar 10 00:34:04 crc kubenswrapper[4906]: I0310 00:34:04.851004 4906 scope.go:117] "RemoveContainer" containerID="e20f43e0fa1ccc000a74f94283064218801c3f6c7b961d924fc066dab6b48a6f" Mar 10 00:34:04 crc kubenswrapper[4906]: I0310 00:34:04.940003 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["service-telemetry/default-interconnect-68864d46cb-g96jw"] Mar 10 00:34:05 crc kubenswrapper[4906]: I0310 00:34:05.264011 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551708-2hp2g"] Mar 10 00:34:05 crc kubenswrapper[4906]: I0310 00:34:05.269397 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551708-2hp2g"] Mar 10 00:34:05 crc kubenswrapper[4906]: I0310 00:34:05.850958 4906 generic.go:334] "Generic (PLEG): container finished" podID="216238e4-5dbc-4cc6-838b-d8521159606f" containerID="38e190a32e81e8b4765abf5232592987b3549141672a5acc2d55fb3820508816" exitCode=0 Mar 10 00:34:05 crc kubenswrapper[4906]: I0310 00:34:05.851032 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8m4z" event={"ID":"216238e4-5dbc-4cc6-838b-d8521159606f","Type":"ContainerDied","Data":"38e190a32e81e8b4765abf5232592987b3549141672a5acc2d55fb3820508816"} Mar 10 00:34:05 crc kubenswrapper[4906]: I0310 00:34:05.851073 4906 scope.go:117] "RemoveContainer" containerID="0ecaae6f84964d04b4b0ecf2c1ac53189971bb3753a4112646ecfe07bcf2f922" Mar 10 00:34:05 crc kubenswrapper[4906]: I0310 00:34:05.851980 4906 scope.go:117] "RemoveContainer" containerID="38e190a32e81e8b4765abf5232592987b3549141672a5acc2d55fb3820508816" Mar 10 00:34:05 crc kubenswrapper[4906]: E0310 00:34:05.852274 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-sens-meter-smartgateway-5759b4d97-n8m4z_service-telemetry(216238e4-5dbc-4cc6-838b-d8521159606f)\"" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8m4z" podUID="216238e4-5dbc-4cc6-838b-d8521159606f" Mar 10 00:34:05 crc kubenswrapper[4906]: I0310 00:34:05.855621 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/default-interconnect-68864d46cb-g96jw" event={"ID":"16322ed1-3150-4dd7-a85b-ae6b330a040d","Type":"ContainerStarted","Data":"e1b4b44c356598f62ab6db8848e7efbc9a5937d345dfd0dc7306bcfbea4c5937"} Mar 10 00:34:05 crc kubenswrapper[4906]: I0310 00:34:05.855685 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-g96jw" event={"ID":"16322ed1-3150-4dd7-a85b-ae6b330a040d","Type":"ContainerStarted","Data":"7879643d2c29a5ef3aa78d0679371d2a2bee33c50c9011fbd83aa0c242600568"} Mar 10 00:34:05 crc kubenswrapper[4906]: I0310 00:34:05.860405 4906 generic.go:334] "Generic (PLEG): container finished" podID="c914b63b-284c-4693-b383-0583ea97faaf" containerID="6e7b7a0249dd95dca28d22a98d2d69bca8095a99b5a69cb4f95cd7094d90a841" exitCode=0 Mar 10 00:34:05 crc kubenswrapper[4906]: I0310 00:34:05.860486 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-6d5f89bf86-x79fr" event={"ID":"c914b63b-284c-4693-b383-0583ea97faaf","Type":"ContainerDied","Data":"6e7b7a0249dd95dca28d22a98d2d69bca8095a99b5a69cb4f95cd7094d90a841"} Mar 10 00:34:05 crc kubenswrapper[4906]: I0310 00:34:05.860850 4906 scope.go:117] "RemoveContainer" containerID="6e7b7a0249dd95dca28d22a98d2d69bca8095a99b5a69cb4f95cd7094d90a841" Mar 10 00:34:05 crc kubenswrapper[4906]: E0310 00:34:05.861067 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-coll-event-smartgateway-6d5f89bf86-x79fr_service-telemetry(c914b63b-284c-4693-b383-0583ea97faaf)\"" pod="service-telemetry/default-cloud1-coll-event-smartgateway-6d5f89bf86-x79fr" podUID="c914b63b-284c-4693-b383-0583ea97faaf" Mar 10 00:34:05 crc kubenswrapper[4906]: I0310 00:34:05.873024 4906 generic.go:334] "Generic (PLEG): container finished" podID="436fc62f-0a9b-46f8-8b63-a2ad9a34dfff" 
containerID="2a66916abfae909e51bf560545b024a106e8a9743c70bb1fc88d81e85b124132" exitCode=0 Mar 10 00:34:05 crc kubenswrapper[4906]: I0310 00:34:05.873299 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-c54b5cd7b-2mf46" event={"ID":"436fc62f-0a9b-46f8-8b63-a2ad9a34dfff","Type":"ContainerDied","Data":"2a66916abfae909e51bf560545b024a106e8a9743c70bb1fc88d81e85b124132"} Mar 10 00:34:05 crc kubenswrapper[4906]: I0310 00:34:05.873980 4906 scope.go:117] "RemoveContainer" containerID="2a66916abfae909e51bf560545b024a106e8a9743c70bb1fc88d81e85b124132" Mar 10 00:34:05 crc kubenswrapper[4906]: E0310 00:34:05.874298 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-ceil-event-smartgateway-c54b5cd7b-2mf46_service-telemetry(436fc62f-0a9b-46f8-8b63-a2ad9a34dfff)\"" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-c54b5cd7b-2mf46" podUID="436fc62f-0a9b-46f8-8b63-a2ad9a34dfff" Mar 10 00:34:05 crc kubenswrapper[4906]: I0310 00:34:05.900725 4906 scope.go:117] "RemoveContainer" containerID="08066d95d9100c3283f1e3fa939375fd446b5dcf7751a8887667c423e2d050ae" Mar 10 00:34:05 crc kubenswrapper[4906]: I0310 00:34:05.930493 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-interconnect-68864d46cb-g96jw" podStartSLOduration=3.930472088 podStartE2EDuration="3.930472088s" podCreationTimestamp="2026-03-10 00:34:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 00:34:05.925150569 +0000 UTC m=+1672.073045681" watchObservedRunningTime="2026-03-10 00:34:05.930472088 +0000 UTC m=+1672.078367200" Mar 10 00:34:05 crc kubenswrapper[4906]: I0310 00:34:05.960938 4906 scope.go:117] "RemoveContainer" 
containerID="dfba1818dbbe533e0fb93624d5f7833b1f05c705b76ba98ba5eb973334db8bcd" Mar 10 00:34:06 crc kubenswrapper[4906]: I0310 00:34:06.585127 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5f5c4bb-bcea-4d6a-8732-a69f6d373952" path="/var/lib/kubelet/pods/d5f5c4bb-bcea-4d6a-8732-a69f6d373952/volumes" Mar 10 00:34:06 crc kubenswrapper[4906]: I0310 00:34:06.996532 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/qdr-test"] Mar 10 00:34:06 crc kubenswrapper[4906]: E0310 00:34:06.996811 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce0b1165-845d-47b5-b6f8-3d2519066da0" containerName="oc" Mar 10 00:34:06 crc kubenswrapper[4906]: I0310 00:34:06.996824 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce0b1165-845d-47b5-b6f8-3d2519066da0" containerName="oc" Mar 10 00:34:06 crc kubenswrapper[4906]: I0310 00:34:06.996944 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce0b1165-845d-47b5-b6f8-3d2519066da0" containerName="oc" Mar 10 00:34:06 crc kubenswrapper[4906]: I0310 00:34:06.997343 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/qdr-test" Mar 10 00:34:06 crc kubenswrapper[4906]: I0310 00:34:06.999901 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"qdr-test-config" Mar 10 00:34:07 crc kubenswrapper[4906]: I0310 00:34:07.000110 4906 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-selfsigned" Mar 10 00:34:07 crc kubenswrapper[4906]: I0310 00:34:07.012455 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/qdr-test"] Mar 10 00:34:07 crc kubenswrapper[4906]: I0310 00:34:07.045155 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/44c504b9-118f-4b8b-ad28-f925b5e522bb-qdr-test-config\") pod \"qdr-test\" (UID: \"44c504b9-118f-4b8b-ad28-f925b5e522bb\") " pod="service-telemetry/qdr-test" Mar 10 00:34:07 crc kubenswrapper[4906]: I0310 00:34:07.045222 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/44c504b9-118f-4b8b-ad28-f925b5e522bb-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"44c504b9-118f-4b8b-ad28-f925b5e522bb\") " pod="service-telemetry/qdr-test" Mar 10 00:34:07 crc kubenswrapper[4906]: I0310 00:34:07.045267 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmtw6\" (UniqueName: \"kubernetes.io/projected/44c504b9-118f-4b8b-ad28-f925b5e522bb-kube-api-access-pmtw6\") pod \"qdr-test\" (UID: \"44c504b9-118f-4b8b-ad28-f925b5e522bb\") " pod="service-telemetry/qdr-test" Mar 10 00:34:07 crc kubenswrapper[4906]: I0310 00:34:07.146585 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"qdr-test-config\" (UniqueName: 
\"kubernetes.io/configmap/44c504b9-118f-4b8b-ad28-f925b5e522bb-qdr-test-config\") pod \"qdr-test\" (UID: \"44c504b9-118f-4b8b-ad28-f925b5e522bb\") " pod="service-telemetry/qdr-test" Mar 10 00:34:07 crc kubenswrapper[4906]: I0310 00:34:07.146664 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/44c504b9-118f-4b8b-ad28-f925b5e522bb-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"44c504b9-118f-4b8b-ad28-f925b5e522bb\") " pod="service-telemetry/qdr-test" Mar 10 00:34:07 crc kubenswrapper[4906]: I0310 00:34:07.146711 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmtw6\" (UniqueName: \"kubernetes.io/projected/44c504b9-118f-4b8b-ad28-f925b5e522bb-kube-api-access-pmtw6\") pod \"qdr-test\" (UID: \"44c504b9-118f-4b8b-ad28-f925b5e522bb\") " pod="service-telemetry/qdr-test" Mar 10 00:34:07 crc kubenswrapper[4906]: I0310 00:34:07.147659 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/44c504b9-118f-4b8b-ad28-f925b5e522bb-qdr-test-config\") pod \"qdr-test\" (UID: \"44c504b9-118f-4b8b-ad28-f925b5e522bb\") " pod="service-telemetry/qdr-test" Mar 10 00:34:07 crc kubenswrapper[4906]: I0310 00:34:07.153026 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/44c504b9-118f-4b8b-ad28-f925b5e522bb-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"44c504b9-118f-4b8b-ad28-f925b5e522bb\") " pod="service-telemetry/qdr-test" Mar 10 00:34:07 crc kubenswrapper[4906]: I0310 00:34:07.163761 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmtw6\" (UniqueName: \"kubernetes.io/projected/44c504b9-118f-4b8b-ad28-f925b5e522bb-kube-api-access-pmtw6\") pod \"qdr-test\" (UID: 
\"44c504b9-118f-4b8b-ad28-f925b5e522bb\") " pod="service-telemetry/qdr-test" Mar 10 00:34:07 crc kubenswrapper[4906]: I0310 00:34:07.311914 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/qdr-test" Mar 10 00:34:07 crc kubenswrapper[4906]: I0310 00:34:07.518828 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/qdr-test"] Mar 10 00:34:07 crc kubenswrapper[4906]: I0310 00:34:07.894329 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/qdr-test" event={"ID":"44c504b9-118f-4b8b-ad28-f925b5e522bb","Type":"ContainerStarted","Data":"a82bd3e26f7cd0801447df15c16c567756e32b71717763ef93eb535fecc4d9de"} Mar 10 00:34:11 crc kubenswrapper[4906]: I0310 00:34:11.576721 4906 scope.go:117] "RemoveContainer" containerID="3d03901045dd3898d4d4472388a38a1a6380b91b22032509551213e74b4671a3" Mar 10 00:34:11 crc kubenswrapper[4906]: E0310 00:34:11.577264 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bxtw4_openshift-machine-config-operator(72d61d35-0a64-45a5-8df3-9c429727deba)\"" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" Mar 10 00:34:15 crc kubenswrapper[4906]: I0310 00:34:15.965881 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/qdr-test" event={"ID":"44c504b9-118f-4b8b-ad28-f925b5e522bb","Type":"ContainerStarted","Data":"136b80118ce74ebb9a0bbe28ae42cf5e11f954678248bf29f5e9ab311d3b5eb8"} Mar 10 00:34:16 crc kubenswrapper[4906]: I0310 00:34:16.009111 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/qdr-test" podStartSLOduration=2.498572103 podStartE2EDuration="10.009092964s" podCreationTimestamp="2026-03-10 00:34:06 +0000 UTC" 
firstStartedPulling="2026-03-10 00:34:07.534783201 +0000 UTC m=+1673.682678323" lastFinishedPulling="2026-03-10 00:34:15.045304072 +0000 UTC m=+1681.193199184" observedRunningTime="2026-03-10 00:34:15.993482206 +0000 UTC m=+1682.141377318" watchObservedRunningTime="2026-03-10 00:34:16.009092964 +0000 UTC m=+1682.156988076" Mar 10 00:34:16 crc kubenswrapper[4906]: I0310 00:34:16.328135 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/stf-smoketest-smoke1-z24nw"] Mar 10 00:34:16 crc kubenswrapper[4906]: I0310 00:34:16.329114 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-z24nw" Mar 10 00:34:16 crc kubenswrapper[4906]: I0310 00:34:16.332719 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-sensubility-config" Mar 10 00:34:16 crc kubenswrapper[4906]: I0310 00:34:16.332882 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-collectd-entrypoint-script" Mar 10 00:34:16 crc kubenswrapper[4906]: I0310 00:34:16.333095 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-ceilometer-entrypoint-script" Mar 10 00:34:16 crc kubenswrapper[4906]: I0310 00:34:16.333566 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-ceilometer-publisher" Mar 10 00:34:16 crc kubenswrapper[4906]: I0310 00:34:16.333630 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-collectd-config" Mar 10 00:34:16 crc kubenswrapper[4906]: I0310 00:34:16.335832 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-healthcheck-log" Mar 10 00:34:16 crc kubenswrapper[4906]: I0310 00:34:16.351046 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/stf-smoketest-smoke1-z24nw"] Mar 
10 00:34:16 crc kubenswrapper[4906]: I0310 00:34:16.386104 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/0545a410-bd7a-4e49-ab46-73fe128ecca9-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-z24nw\" (UID: \"0545a410-bd7a-4e49-ab46-73fe128ecca9\") " pod="service-telemetry/stf-smoketest-smoke1-z24nw" Mar 10 00:34:16 crc kubenswrapper[4906]: I0310 00:34:16.386271 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/0545a410-bd7a-4e49-ab46-73fe128ecca9-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-z24nw\" (UID: \"0545a410-bd7a-4e49-ab46-73fe128ecca9\") " pod="service-telemetry/stf-smoketest-smoke1-z24nw" Mar 10 00:34:16 crc kubenswrapper[4906]: I0310 00:34:16.386369 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfp6p\" (UniqueName: \"kubernetes.io/projected/0545a410-bd7a-4e49-ab46-73fe128ecca9-kube-api-access-mfp6p\") pod \"stf-smoketest-smoke1-z24nw\" (UID: \"0545a410-bd7a-4e49-ab46-73fe128ecca9\") " pod="service-telemetry/stf-smoketest-smoke1-z24nw" Mar 10 00:34:16 crc kubenswrapper[4906]: I0310 00:34:16.386438 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/0545a410-bd7a-4e49-ab46-73fe128ecca9-sensubility-config\") pod \"stf-smoketest-smoke1-z24nw\" (UID: \"0545a410-bd7a-4e49-ab46-73fe128ecca9\") " pod="service-telemetry/stf-smoketest-smoke1-z24nw" Mar 10 00:34:16 crc kubenswrapper[4906]: I0310 00:34:16.386472 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/0545a410-bd7a-4e49-ab46-73fe128ecca9-collectd-config\") pod 
\"stf-smoketest-smoke1-z24nw\" (UID: \"0545a410-bd7a-4e49-ab46-73fe128ecca9\") " pod="service-telemetry/stf-smoketest-smoke1-z24nw" Mar 10 00:34:16 crc kubenswrapper[4906]: I0310 00:34:16.386528 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/0545a410-bd7a-4e49-ab46-73fe128ecca9-ceilometer-publisher\") pod \"stf-smoketest-smoke1-z24nw\" (UID: \"0545a410-bd7a-4e49-ab46-73fe128ecca9\") " pod="service-telemetry/stf-smoketest-smoke1-z24nw" Mar 10 00:34:16 crc kubenswrapper[4906]: I0310 00:34:16.386586 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/0545a410-bd7a-4e49-ab46-73fe128ecca9-healthcheck-log\") pod \"stf-smoketest-smoke1-z24nw\" (UID: \"0545a410-bd7a-4e49-ab46-73fe128ecca9\") " pod="service-telemetry/stf-smoketest-smoke1-z24nw" Mar 10 00:34:16 crc kubenswrapper[4906]: I0310 00:34:16.487988 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/0545a410-bd7a-4e49-ab46-73fe128ecca9-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-z24nw\" (UID: \"0545a410-bd7a-4e49-ab46-73fe128ecca9\") " pod="service-telemetry/stf-smoketest-smoke1-z24nw" Mar 10 00:34:16 crc kubenswrapper[4906]: I0310 00:34:16.488095 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfp6p\" (UniqueName: \"kubernetes.io/projected/0545a410-bd7a-4e49-ab46-73fe128ecca9-kube-api-access-mfp6p\") pod \"stf-smoketest-smoke1-z24nw\" (UID: \"0545a410-bd7a-4e49-ab46-73fe128ecca9\") " pod="service-telemetry/stf-smoketest-smoke1-z24nw" Mar 10 00:34:16 crc kubenswrapper[4906]: I0310 00:34:16.488692 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sensubility-config\" (UniqueName: 
\"kubernetes.io/configmap/0545a410-bd7a-4e49-ab46-73fe128ecca9-sensubility-config\") pod \"stf-smoketest-smoke1-z24nw\" (UID: \"0545a410-bd7a-4e49-ab46-73fe128ecca9\") " pod="service-telemetry/stf-smoketest-smoke1-z24nw" Mar 10 00:34:16 crc kubenswrapper[4906]: I0310 00:34:16.488820 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/0545a410-bd7a-4e49-ab46-73fe128ecca9-collectd-config\") pod \"stf-smoketest-smoke1-z24nw\" (UID: \"0545a410-bd7a-4e49-ab46-73fe128ecca9\") " pod="service-telemetry/stf-smoketest-smoke1-z24nw" Mar 10 00:34:16 crc kubenswrapper[4906]: I0310 00:34:16.488903 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/0545a410-bd7a-4e49-ab46-73fe128ecca9-ceilometer-publisher\") pod \"stf-smoketest-smoke1-z24nw\" (UID: \"0545a410-bd7a-4e49-ab46-73fe128ecca9\") " pod="service-telemetry/stf-smoketest-smoke1-z24nw" Mar 10 00:34:16 crc kubenswrapper[4906]: I0310 00:34:16.489006 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/0545a410-bd7a-4e49-ab46-73fe128ecca9-healthcheck-log\") pod \"stf-smoketest-smoke1-z24nw\" (UID: \"0545a410-bd7a-4e49-ab46-73fe128ecca9\") " pod="service-telemetry/stf-smoketest-smoke1-z24nw" Mar 10 00:34:16 crc kubenswrapper[4906]: I0310 00:34:16.489073 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/0545a410-bd7a-4e49-ab46-73fe128ecca9-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-z24nw\" (UID: \"0545a410-bd7a-4e49-ab46-73fe128ecca9\") " pod="service-telemetry/stf-smoketest-smoke1-z24nw" Mar 10 00:34:16 crc kubenswrapper[4906]: I0310 00:34:16.489711 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/0545a410-bd7a-4e49-ab46-73fe128ecca9-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-z24nw\" (UID: \"0545a410-bd7a-4e49-ab46-73fe128ecca9\") " pod="service-telemetry/stf-smoketest-smoke1-z24nw" Mar 10 00:34:16 crc kubenswrapper[4906]: I0310 00:34:16.489738 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/0545a410-bd7a-4e49-ab46-73fe128ecca9-sensubility-config\") pod \"stf-smoketest-smoke1-z24nw\" (UID: \"0545a410-bd7a-4e49-ab46-73fe128ecca9\") " pod="service-telemetry/stf-smoketest-smoke1-z24nw" Mar 10 00:34:16 crc kubenswrapper[4906]: I0310 00:34:16.490136 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/0545a410-bd7a-4e49-ab46-73fe128ecca9-collectd-config\") pod \"stf-smoketest-smoke1-z24nw\" (UID: \"0545a410-bd7a-4e49-ab46-73fe128ecca9\") " pod="service-telemetry/stf-smoketest-smoke1-z24nw" Mar 10 00:34:16 crc kubenswrapper[4906]: I0310 00:34:16.490688 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/0545a410-bd7a-4e49-ab46-73fe128ecca9-healthcheck-log\") pod \"stf-smoketest-smoke1-z24nw\" (UID: \"0545a410-bd7a-4e49-ab46-73fe128ecca9\") " pod="service-telemetry/stf-smoketest-smoke1-z24nw" Mar 10 00:34:16 crc kubenswrapper[4906]: I0310 00:34:16.490836 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/0545a410-bd7a-4e49-ab46-73fe128ecca9-ceilometer-publisher\") pod \"stf-smoketest-smoke1-z24nw\" (UID: \"0545a410-bd7a-4e49-ab46-73fe128ecca9\") " pod="service-telemetry/stf-smoketest-smoke1-z24nw" Mar 10 00:34:16 crc kubenswrapper[4906]: I0310 00:34:16.491728 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/0545a410-bd7a-4e49-ab46-73fe128ecca9-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-z24nw\" (UID: \"0545a410-bd7a-4e49-ab46-73fe128ecca9\") " pod="service-telemetry/stf-smoketest-smoke1-z24nw" Mar 10 00:34:16 crc kubenswrapper[4906]: I0310 00:34:16.521685 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfp6p\" (UniqueName: \"kubernetes.io/projected/0545a410-bd7a-4e49-ab46-73fe128ecca9-kube-api-access-mfp6p\") pod \"stf-smoketest-smoke1-z24nw\" (UID: \"0545a410-bd7a-4e49-ab46-73fe128ecca9\") " pod="service-telemetry/stf-smoketest-smoke1-z24nw" Mar 10 00:34:16 crc kubenswrapper[4906]: I0310 00:34:16.576878 4906 scope.go:117] "RemoveContainer" containerID="38e190a32e81e8b4765abf5232592987b3549141672a5acc2d55fb3820508816" Mar 10 00:34:16 crc kubenswrapper[4906]: I0310 00:34:16.643670 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-z24nw" Mar 10 00:34:16 crc kubenswrapper[4906]: I0310 00:34:16.790917 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/curl"] Mar 10 00:34:16 crc kubenswrapper[4906]: I0310 00:34:16.791894 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/curl" Mar 10 00:34:16 crc kubenswrapper[4906]: I0310 00:34:16.804228 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/curl"] Mar 10 00:34:16 crc kubenswrapper[4906]: I0310 00:34:16.896044 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6q9bp\" (UniqueName: \"kubernetes.io/projected/92f8c05e-39c6-439a-ac06-984e5abe92a9-kube-api-access-6q9bp\") pod \"curl\" (UID: \"92f8c05e-39c6-439a-ac06-984e5abe92a9\") " pod="service-telemetry/curl" Mar 10 00:34:16 crc kubenswrapper[4906]: I0310 00:34:16.991364 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-n8m4z" event={"ID":"216238e4-5dbc-4cc6-838b-d8521159606f","Type":"ContainerStarted","Data":"270e4e3c64a4d8f1b680f7218d439a8ffd55372103aebdf7084ca11b50fb2408"} Mar 10 00:34:16 crc kubenswrapper[4906]: I0310 00:34:16.997629 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6q9bp\" (UniqueName: \"kubernetes.io/projected/92f8c05e-39c6-439a-ac06-984e5abe92a9-kube-api-access-6q9bp\") pod \"curl\" (UID: \"92f8c05e-39c6-439a-ac06-984e5abe92a9\") " pod="service-telemetry/curl" Mar 10 00:34:17 crc kubenswrapper[4906]: I0310 00:34:17.020625 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6q9bp\" (UniqueName: \"kubernetes.io/projected/92f8c05e-39c6-439a-ac06-984e5abe92a9-kube-api-access-6q9bp\") pod \"curl\" (UID: \"92f8c05e-39c6-439a-ac06-984e5abe92a9\") " pod="service-telemetry/curl" Mar 10 00:34:17 crc kubenswrapper[4906]: I0310 00:34:17.098139 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/stf-smoketest-smoke1-z24nw"] Mar 10 00:34:17 crc kubenswrapper[4906]: W0310 00:34:17.103100 4906 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0545a410_bd7a_4e49_ab46_73fe128ecca9.slice/crio-bcd014265837480e980cbcb2bd4842bc5652131136e717b58122f47ef42cf3a1 WatchSource:0}: Error finding container bcd014265837480e980cbcb2bd4842bc5652131136e717b58122f47ef42cf3a1: Status 404 returned error can't find the container with id bcd014265837480e980cbcb2bd4842bc5652131136e717b58122f47ef42cf3a1 Mar 10 00:34:17 crc kubenswrapper[4906]: I0310 00:34:17.112014 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/curl" Mar 10 00:34:17 crc kubenswrapper[4906]: I0310 00:34:17.523475 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/curl"] Mar 10 00:34:17 crc kubenswrapper[4906]: W0310 00:34:17.531620 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod92f8c05e_39c6_439a_ac06_984e5abe92a9.slice/crio-a3ce39c466711fc6889261ee7a657cbe24d44ea4f442a421e4ef43464dd6a2d6 WatchSource:0}: Error finding container a3ce39c466711fc6889261ee7a657cbe24d44ea4f442a421e4ef43464dd6a2d6: Status 404 returned error can't find the container with id a3ce39c466711fc6889261ee7a657cbe24d44ea4f442a421e4ef43464dd6a2d6 Mar 10 00:34:18 crc kubenswrapper[4906]: I0310 00:34:18.000093 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"92f8c05e-39c6-439a-ac06-984e5abe92a9","Type":"ContainerStarted","Data":"a3ce39c466711fc6889261ee7a657cbe24d44ea4f442a421e4ef43464dd6a2d6"} Mar 10 00:34:18 crc kubenswrapper[4906]: I0310 00:34:18.000831 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-z24nw" event={"ID":"0545a410-bd7a-4e49-ab46-73fe128ecca9","Type":"ContainerStarted","Data":"bcd014265837480e980cbcb2bd4842bc5652131136e717b58122f47ef42cf3a1"} Mar 10 00:34:18 crc kubenswrapper[4906]: I0310 00:34:18.577337 4906 scope.go:117] "RemoveContainer" 
containerID="6e7b7a0249dd95dca28d22a98d2d69bca8095a99b5a69cb4f95cd7094d90a841" Mar 10 00:34:18 crc kubenswrapper[4906]: I0310 00:34:18.578372 4906 scope.go:117] "RemoveContainer" containerID="a2cc847639ad59af241330f07e320d1ec5f628e8b770baea7657be35e531e36c" Mar 10 00:34:18 crc kubenswrapper[4906]: I0310 00:34:18.578582 4906 scope.go:117] "RemoveContainer" containerID="3148298c908d09353bd22500a067e51fad8cd5aa743467da9c464c0f8232cc5b" Mar 10 00:34:20 crc kubenswrapper[4906]: I0310 00:34:20.023158 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-6d5f89bf86-x79fr" event={"ID":"c914b63b-284c-4693-b383-0583ea97faaf","Type":"ContainerStarted","Data":"42de086c3c0f8405235645b86a135509e1512d954271ff53ff283177202996f5"} Mar 10 00:34:20 crc kubenswrapper[4906]: I0310 00:34:20.026989 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-nvklp" event={"ID":"17b2c3a4-f715-4e49-baf6-ec674881557e","Type":"ContainerStarted","Data":"848e33580feffb790d46d043c46fbe95014e7353b29c82c5bb33ec1530ed54b1"} Mar 10 00:34:20 crc kubenswrapper[4906]: I0310 00:34:20.039690 4906 generic.go:334] "Generic (PLEG): container finished" podID="92f8c05e-39c6-439a-ac06-984e5abe92a9" containerID="d21e4512814e5bc02501edd6fe4624be9842a5d0f1f1686ebf9663264ac275ed" exitCode=0 Mar 10 00:34:20 crc kubenswrapper[4906]: I0310 00:34:20.039981 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"92f8c05e-39c6-439a-ac06-984e5abe92a9","Type":"ContainerDied","Data":"d21e4512814e5bc02501edd6fe4624be9842a5d0f1f1686ebf9663264ac275ed"} Mar 10 00:34:20 crc kubenswrapper[4906]: I0310 00:34:20.042921 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-8z7rm" 
event={"ID":"6626cf94-749f-4f6e-8a4e-9c58410aaa46","Type":"ContainerStarted","Data":"e5c2a9539ef8dca8ce2b067cf7186046d0887be7e5159bef16d44a950966fd26"} Mar 10 00:34:20 crc kubenswrapper[4906]: I0310 00:34:20.576627 4906 scope.go:117] "RemoveContainer" containerID="2a66916abfae909e51bf560545b024a106e8a9743c70bb1fc88d81e85b124132" Mar 10 00:34:26 crc kubenswrapper[4906]: I0310 00:34:26.576551 4906 scope.go:117] "RemoveContainer" containerID="3d03901045dd3898d4d4472388a38a1a6380b91b22032509551213e74b4671a3" Mar 10 00:34:26 crc kubenswrapper[4906]: E0310 00:34:26.577190 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bxtw4_openshift-machine-config-operator(72d61d35-0a64-45a5-8df3-9c429727deba)\"" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" Mar 10 00:34:26 crc kubenswrapper[4906]: I0310 00:34:26.614428 4906 scope.go:117] "RemoveContainer" containerID="92b171299d2ef5e48eeccb6789efb415f5892340d4770ec912aef8d2844b4fa4" Mar 10 00:34:28 crc kubenswrapper[4906]: I0310 00:34:28.773664 4906 scope.go:117] "RemoveContainer" containerID="7f7e4ecad566630fe6adcd2e8388d087436c672f8989c4cb49b3fe3637eafc12" Mar 10 00:34:28 crc kubenswrapper[4906]: I0310 00:34:28.825526 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/curl" Mar 10 00:34:28 crc kubenswrapper[4906]: I0310 00:34:28.831267 4906 scope.go:117] "RemoveContainer" containerID="80f5294adf972196dc5188a3767901e794ef66f50e16a446c09bbf7826241001" Mar 10 00:34:28 crc kubenswrapper[4906]: I0310 00:34:28.953498 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6q9bp\" (UniqueName: \"kubernetes.io/projected/92f8c05e-39c6-439a-ac06-984e5abe92a9-kube-api-access-6q9bp\") pod \"92f8c05e-39c6-439a-ac06-984e5abe92a9\" (UID: \"92f8c05e-39c6-439a-ac06-984e5abe92a9\") " Mar 10 00:34:28 crc kubenswrapper[4906]: I0310 00:34:28.957015 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92f8c05e-39c6-439a-ac06-984e5abe92a9-kube-api-access-6q9bp" (OuterVolumeSpecName: "kube-api-access-6q9bp") pod "92f8c05e-39c6-439a-ac06-984e5abe92a9" (UID: "92f8c05e-39c6-439a-ac06-984e5abe92a9"). InnerVolumeSpecName "kube-api-access-6q9bp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:34:29 crc kubenswrapper[4906]: I0310 00:34:29.007197 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_curl_92f8c05e-39c6-439a-ac06-984e5abe92a9/curl/0.log" Mar 10 00:34:29 crc kubenswrapper[4906]: I0310 00:34:29.056516 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6q9bp\" (UniqueName: \"kubernetes.io/projected/92f8c05e-39c6-439a-ac06-984e5abe92a9-kube-api-access-6q9bp\") on node \"crc\" DevicePath \"\"" Mar 10 00:34:29 crc kubenswrapper[4906]: I0310 00:34:29.097596 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/curl" Mar 10 00:34:29 crc kubenswrapper[4906]: I0310 00:34:29.097582 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"92f8c05e-39c6-439a-ac06-984e5abe92a9","Type":"ContainerDied","Data":"a3ce39c466711fc6889261ee7a657cbe24d44ea4f442a421e4ef43464dd6a2d6"} Mar 10 00:34:29 crc kubenswrapper[4906]: I0310 00:34:29.097678 4906 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3ce39c466711fc6889261ee7a657cbe24d44ea4f442a421e4ef43464dd6a2d6" Mar 10 00:34:29 crc kubenswrapper[4906]: I0310 00:34:29.099460 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-z24nw" event={"ID":"0545a410-bd7a-4e49-ab46-73fe128ecca9","Type":"ContainerStarted","Data":"83a62760b121f1e29894b7a47655f37d79e2d94fab464d5be2e1e6d9bc52ffc7"} Mar 10 00:34:29 crc kubenswrapper[4906]: I0310 00:34:29.101977 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-c54b5cd7b-2mf46" event={"ID":"436fc62f-0a9b-46f8-8b63-a2ad9a34dfff","Type":"ContainerStarted","Data":"22c83a25bacd2a602f8bdba1378c68c914886bdd11649744d54a45d0f1f058be"} Mar 10 00:34:29 crc kubenswrapper[4906]: I0310 00:34:29.332741 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-6856cfb745-zsbkd_b7a5d692-659c-4912-b290-4a4010f370b6/prometheus-webhook-snmp/0.log" Mar 10 00:34:35 crc kubenswrapper[4906]: I0310 00:34:35.152300 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-z24nw" event={"ID":"0545a410-bd7a-4e49-ab46-73fe128ecca9","Type":"ContainerStarted","Data":"e61530ef100020acd9e5ae6753dc2f3c0bd5bb9eaad5c21070d782d684722bdd"} Mar 10 00:34:35 crc kubenswrapper[4906]: I0310 00:34:35.180294 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/stf-smoketest-smoke1-z24nw" 
podStartSLOduration=1.8437439150000001 podStartE2EDuration="19.180268457s" podCreationTimestamp="2026-03-10 00:34:16 +0000 UTC" firstStartedPulling="2026-03-10 00:34:17.106641863 +0000 UTC m=+1683.254536975" lastFinishedPulling="2026-03-10 00:34:34.443166405 +0000 UTC m=+1700.591061517" observedRunningTime="2026-03-10 00:34:35.175096292 +0000 UTC m=+1701.322991424" watchObservedRunningTime="2026-03-10 00:34:35.180268457 +0000 UTC m=+1701.328163599" Mar 10 00:34:37 crc kubenswrapper[4906]: I0310 00:34:37.577193 4906 scope.go:117] "RemoveContainer" containerID="3d03901045dd3898d4d4472388a38a1a6380b91b22032509551213e74b4671a3" Mar 10 00:34:37 crc kubenswrapper[4906]: E0310 00:34:37.577891 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bxtw4_openshift-machine-config-operator(72d61d35-0a64-45a5-8df3-9c429727deba)\"" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" Mar 10 00:34:52 crc kubenswrapper[4906]: I0310 00:34:52.577029 4906 scope.go:117] "RemoveContainer" containerID="3d03901045dd3898d4d4472388a38a1a6380b91b22032509551213e74b4671a3" Mar 10 00:34:52 crc kubenswrapper[4906]: E0310 00:34:52.578134 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bxtw4_openshift-machine-config-operator(72d61d35-0a64-45a5-8df3-9c429727deba)\"" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" Mar 10 00:34:59 crc kubenswrapper[4906]: I0310 00:34:59.509273 4906 log.go:25] "Finished parsing log file" 
path="/var/log/pods/service-telemetry_default-snmp-webhook-6856cfb745-zsbkd_b7a5d692-659c-4912-b290-4a4010f370b6/prometheus-webhook-snmp/0.log" Mar 10 00:35:03 crc kubenswrapper[4906]: I0310 00:35:03.378849 4906 generic.go:334] "Generic (PLEG): container finished" podID="0545a410-bd7a-4e49-ab46-73fe128ecca9" containerID="83a62760b121f1e29894b7a47655f37d79e2d94fab464d5be2e1e6d9bc52ffc7" exitCode=0 Mar 10 00:35:03 crc kubenswrapper[4906]: I0310 00:35:03.379138 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-z24nw" event={"ID":"0545a410-bd7a-4e49-ab46-73fe128ecca9","Type":"ContainerDied","Data":"83a62760b121f1e29894b7a47655f37d79e2d94fab464d5be2e1e6d9bc52ffc7"} Mar 10 00:35:03 crc kubenswrapper[4906]: I0310 00:35:03.379581 4906 scope.go:117] "RemoveContainer" containerID="83a62760b121f1e29894b7a47655f37d79e2d94fab464d5be2e1e6d9bc52ffc7" Mar 10 00:35:05 crc kubenswrapper[4906]: I0310 00:35:05.577204 4906 scope.go:117] "RemoveContainer" containerID="3d03901045dd3898d4d4472388a38a1a6380b91b22032509551213e74b4671a3" Mar 10 00:35:05 crc kubenswrapper[4906]: E0310 00:35:05.577462 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bxtw4_openshift-machine-config-operator(72d61d35-0a64-45a5-8df3-9c429727deba)\"" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" Mar 10 00:35:06 crc kubenswrapper[4906]: I0310 00:35:06.412197 4906 generic.go:334] "Generic (PLEG): container finished" podID="0545a410-bd7a-4e49-ab46-73fe128ecca9" containerID="e61530ef100020acd9e5ae6753dc2f3c0bd5bb9eaad5c21070d782d684722bdd" exitCode=0 Mar 10 00:35:06 crc kubenswrapper[4906]: I0310 00:35:06.412250 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-z24nw" 
event={"ID":"0545a410-bd7a-4e49-ab46-73fe128ecca9","Type":"ContainerDied","Data":"e61530ef100020acd9e5ae6753dc2f3c0bd5bb9eaad5c21070d782d684722bdd"} Mar 10 00:35:07 crc kubenswrapper[4906]: I0310 00:35:07.712053 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-z24nw" Mar 10 00:35:07 crc kubenswrapper[4906]: I0310 00:35:07.796993 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/0545a410-bd7a-4e49-ab46-73fe128ecca9-collectd-entrypoint-script\") pod \"0545a410-bd7a-4e49-ab46-73fe128ecca9\" (UID: \"0545a410-bd7a-4e49-ab46-73fe128ecca9\") " Mar 10 00:35:07 crc kubenswrapper[4906]: I0310 00:35:07.797121 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/0545a410-bd7a-4e49-ab46-73fe128ecca9-healthcheck-log\") pod \"0545a410-bd7a-4e49-ab46-73fe128ecca9\" (UID: \"0545a410-bd7a-4e49-ab46-73fe128ecca9\") " Mar 10 00:35:07 crc kubenswrapper[4906]: I0310 00:35:07.797153 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfp6p\" (UniqueName: \"kubernetes.io/projected/0545a410-bd7a-4e49-ab46-73fe128ecca9-kube-api-access-mfp6p\") pod \"0545a410-bd7a-4e49-ab46-73fe128ecca9\" (UID: \"0545a410-bd7a-4e49-ab46-73fe128ecca9\") " Mar 10 00:35:07 crc kubenswrapper[4906]: I0310 00:35:07.797891 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/0545a410-bd7a-4e49-ab46-73fe128ecca9-collectd-config\") pod \"0545a410-bd7a-4e49-ab46-73fe128ecca9\" (UID: \"0545a410-bd7a-4e49-ab46-73fe128ecca9\") " Mar 10 00:35:07 crc kubenswrapper[4906]: I0310 00:35:07.797915 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-publisher\" (UniqueName: 
\"kubernetes.io/configmap/0545a410-bd7a-4e49-ab46-73fe128ecca9-ceilometer-publisher\") pod \"0545a410-bd7a-4e49-ab46-73fe128ecca9\" (UID: \"0545a410-bd7a-4e49-ab46-73fe128ecca9\") " Mar 10 00:35:07 crc kubenswrapper[4906]: I0310 00:35:07.797979 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/0545a410-bd7a-4e49-ab46-73fe128ecca9-sensubility-config\") pod \"0545a410-bd7a-4e49-ab46-73fe128ecca9\" (UID: \"0545a410-bd7a-4e49-ab46-73fe128ecca9\") " Mar 10 00:35:07 crc kubenswrapper[4906]: I0310 00:35:07.798005 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/0545a410-bd7a-4e49-ab46-73fe128ecca9-ceilometer-entrypoint-script\") pod \"0545a410-bd7a-4e49-ab46-73fe128ecca9\" (UID: \"0545a410-bd7a-4e49-ab46-73fe128ecca9\") " Mar 10 00:35:07 crc kubenswrapper[4906]: I0310 00:35:07.816994 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0545a410-bd7a-4e49-ab46-73fe128ecca9-kube-api-access-mfp6p" (OuterVolumeSpecName: "kube-api-access-mfp6p") pod "0545a410-bd7a-4e49-ab46-73fe128ecca9" (UID: "0545a410-bd7a-4e49-ab46-73fe128ecca9"). InnerVolumeSpecName "kube-api-access-mfp6p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:35:07 crc kubenswrapper[4906]: I0310 00:35:07.817881 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfp6p\" (UniqueName: \"kubernetes.io/projected/0545a410-bd7a-4e49-ab46-73fe128ecca9-kube-api-access-mfp6p\") on node \"crc\" DevicePath \"\"" Mar 10 00:35:07 crc kubenswrapper[4906]: I0310 00:35:07.822327 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0545a410-bd7a-4e49-ab46-73fe128ecca9-collectd-config" (OuterVolumeSpecName: "collectd-config") pod "0545a410-bd7a-4e49-ab46-73fe128ecca9" (UID: "0545a410-bd7a-4e49-ab46-73fe128ecca9"). InnerVolumeSpecName "collectd-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:35:07 crc kubenswrapper[4906]: I0310 00:35:07.827515 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0545a410-bd7a-4e49-ab46-73fe128ecca9-ceilometer-publisher" (OuterVolumeSpecName: "ceilometer-publisher") pod "0545a410-bd7a-4e49-ab46-73fe128ecca9" (UID: "0545a410-bd7a-4e49-ab46-73fe128ecca9"). InnerVolumeSpecName "ceilometer-publisher". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:35:07 crc kubenswrapper[4906]: I0310 00:35:07.828495 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0545a410-bd7a-4e49-ab46-73fe128ecca9-sensubility-config" (OuterVolumeSpecName: "sensubility-config") pod "0545a410-bd7a-4e49-ab46-73fe128ecca9" (UID: "0545a410-bd7a-4e49-ab46-73fe128ecca9"). InnerVolumeSpecName "sensubility-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:35:07 crc kubenswrapper[4906]: I0310 00:35:07.829301 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0545a410-bd7a-4e49-ab46-73fe128ecca9-ceilometer-entrypoint-script" (OuterVolumeSpecName: "ceilometer-entrypoint-script") pod "0545a410-bd7a-4e49-ab46-73fe128ecca9" (UID: "0545a410-bd7a-4e49-ab46-73fe128ecca9"). InnerVolumeSpecName "ceilometer-entrypoint-script". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:35:07 crc kubenswrapper[4906]: I0310 00:35:07.839587 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0545a410-bd7a-4e49-ab46-73fe128ecca9-collectd-entrypoint-script" (OuterVolumeSpecName: "collectd-entrypoint-script") pod "0545a410-bd7a-4e49-ab46-73fe128ecca9" (UID: "0545a410-bd7a-4e49-ab46-73fe128ecca9"). InnerVolumeSpecName "collectd-entrypoint-script". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:35:07 crc kubenswrapper[4906]: I0310 00:35:07.840926 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0545a410-bd7a-4e49-ab46-73fe128ecca9-healthcheck-log" (OuterVolumeSpecName: "healthcheck-log") pod "0545a410-bd7a-4e49-ab46-73fe128ecca9" (UID: "0545a410-bd7a-4e49-ab46-73fe128ecca9"). InnerVolumeSpecName "healthcheck-log". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 00:35:07 crc kubenswrapper[4906]: I0310 00:35:07.920102 4906 reconciler_common.go:293] "Volume detached for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/0545a410-bd7a-4e49-ab46-73fe128ecca9-collectd-entrypoint-script\") on node \"crc\" DevicePath \"\"" Mar 10 00:35:07 crc kubenswrapper[4906]: I0310 00:35:07.920160 4906 reconciler_common.go:293] "Volume detached for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/0545a410-bd7a-4e49-ab46-73fe128ecca9-healthcheck-log\") on node \"crc\" DevicePath \"\"" Mar 10 00:35:07 crc kubenswrapper[4906]: I0310 00:35:07.920179 4906 reconciler_common.go:293] "Volume detached for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/0545a410-bd7a-4e49-ab46-73fe128ecca9-collectd-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:35:07 crc kubenswrapper[4906]: I0310 00:35:07.920197 4906 reconciler_common.go:293] "Volume detached for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/0545a410-bd7a-4e49-ab46-73fe128ecca9-ceilometer-publisher\") on node \"crc\" DevicePath \"\"" Mar 10 00:35:07 crc kubenswrapper[4906]: I0310 00:35:07.920214 4906 reconciler_common.go:293] "Volume detached for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/0545a410-bd7a-4e49-ab46-73fe128ecca9-sensubility-config\") on node \"crc\" DevicePath \"\"" Mar 10 00:35:07 crc kubenswrapper[4906]: I0310 00:35:07.920232 4906 reconciler_common.go:293] "Volume detached for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/0545a410-bd7a-4e49-ab46-73fe128ecca9-ceilometer-entrypoint-script\") on node \"crc\" DevicePath \"\"" Mar 10 00:35:08 crc kubenswrapper[4906]: I0310 00:35:08.430522 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-z24nw" 
event={"ID":"0545a410-bd7a-4e49-ab46-73fe128ecca9","Type":"ContainerDied","Data":"bcd014265837480e980cbcb2bd4842bc5652131136e717b58122f47ef42cf3a1"} Mar 10 00:35:08 crc kubenswrapper[4906]: I0310 00:35:08.430569 4906 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bcd014265837480e980cbcb2bd4842bc5652131136e717b58122f47ef42cf3a1" Mar 10 00:35:08 crc kubenswrapper[4906]: I0310 00:35:08.430678 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-z24nw" Mar 10 00:35:09 crc kubenswrapper[4906]: I0310 00:35:09.685668 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_stf-smoketest-smoke1-z24nw_0545a410-bd7a-4e49-ab46-73fe128ecca9/smoketest-collectd/0.log" Mar 10 00:35:10 crc kubenswrapper[4906]: I0310 00:35:10.015564 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_stf-smoketest-smoke1-z24nw_0545a410-bd7a-4e49-ab46-73fe128ecca9/smoketest-ceilometer/0.log" Mar 10 00:35:10 crc kubenswrapper[4906]: I0310 00:35:10.288017 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-interconnect-68864d46cb-g96jw_16322ed1-3150-4dd7-a85b-ae6b330a040d/default-interconnect/0.log" Mar 10 00:35:10 crc kubenswrapper[4906]: I0310 00:35:10.614122 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-meter-smartgateway-7cd87f9766-nvklp_17b2c3a4-f715-4e49-baf6-ec674881557e/bridge/2.log" Mar 10 00:35:10 crc kubenswrapper[4906]: I0310 00:35:10.906937 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-meter-smartgateway-7cd87f9766-nvklp_17b2c3a4-f715-4e49-baf6-ec674881557e/sg-core/0.log" Mar 10 00:35:11 crc kubenswrapper[4906]: I0310 00:35:11.229918 4906 log.go:25] "Finished parsing log file" 
path="/var/log/pods/service-telemetry_default-cloud1-coll-event-smartgateway-6d5f89bf86-x79fr_c914b63b-284c-4693-b383-0583ea97faaf/bridge/2.log" Mar 10 00:35:11 crc kubenswrapper[4906]: I0310 00:35:11.506833 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-event-smartgateway-6d5f89bf86-x79fr_c914b63b-284c-4693-b383-0583ea97faaf/sg-core/0.log" Mar 10 00:35:11 crc kubenswrapper[4906]: I0310 00:35:11.801600 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-meter-smartgateway-57948895dc-8z7rm_6626cf94-749f-4f6e-8a4e-9c58410aaa46/bridge/2.log" Mar 10 00:35:12 crc kubenswrapper[4906]: I0310 00:35:12.119852 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-meter-smartgateway-57948895dc-8z7rm_6626cf94-749f-4f6e-8a4e-9c58410aaa46/sg-core/0.log" Mar 10 00:35:12 crc kubenswrapper[4906]: I0310 00:35:12.423180 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-event-smartgateway-c54b5cd7b-2mf46_436fc62f-0a9b-46f8-8b63-a2ad9a34dfff/bridge/2.log" Mar 10 00:35:12 crc kubenswrapper[4906]: I0310 00:35:12.800490 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-event-smartgateway-c54b5cd7b-2mf46_436fc62f-0a9b-46f8-8b63-a2ad9a34dfff/sg-core/0.log" Mar 10 00:35:13 crc kubenswrapper[4906]: I0310 00:35:13.121427 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-sens-meter-smartgateway-5759b4d97-n8m4z_216238e4-5dbc-4cc6-838b-d8521159606f/bridge/2.log" Mar 10 00:35:13 crc kubenswrapper[4906]: I0310 00:35:13.435948 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-sens-meter-smartgateway-5759b4d97-n8m4z_216238e4-5dbc-4cc6-838b-d8521159606f/sg-core/0.log" Mar 10 00:35:15 crc kubenswrapper[4906]: I0310 00:35:15.542297 4906 log.go:25] "Finished 
parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-76f76f667-9zzn2_d2eb5756-28ad-445b-b3f8-1fa0846cd41b/operator/0.log" Mar 10 00:35:15 crc kubenswrapper[4906]: I0310 00:35:15.856247 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-default-0_7a0470ff-1325-4c14-8b7d-93f6ecad34e6/prometheus/0.log" Mar 10 00:35:16 crc kubenswrapper[4906]: I0310 00:35:16.127580 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_elasticsearch-es-default-0_90a4eb11-c2c3-4d86-8cac-ddccdbab507a/elasticsearch/0.log" Mar 10 00:35:16 crc kubenswrapper[4906]: I0310 00:35:16.404234 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-6856cfb745-zsbkd_b7a5d692-659c-4912-b290-4a4010f370b6/prometheus-webhook-snmp/0.log" Mar 10 00:35:16 crc kubenswrapper[4906]: I0310 00:35:16.677358 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_alertmanager-default-0_f531aabb-4f0a-49a6-95ed-f5e3c04ca148/alertmanager/0.log" Mar 10 00:35:20 crc kubenswrapper[4906]: I0310 00:35:20.577840 4906 scope.go:117] "RemoveContainer" containerID="3d03901045dd3898d4d4472388a38a1a6380b91b22032509551213e74b4671a3" Mar 10 00:35:20 crc kubenswrapper[4906]: E0310 00:35:20.578401 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bxtw4_openshift-machine-config-operator(72d61d35-0a64-45a5-8df3-9c429727deba)\"" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" Mar 10 00:35:33 crc kubenswrapper[4906]: I0310 00:35:33.235908 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-6b6c8c999c-7wl2m_94d067fb-9889-49b0-b722-ede22e5be330/operator/0.log" Mar 10 
00:35:33 crc kubenswrapper[4906]: I0310 00:35:33.576419 4906 scope.go:117] "RemoveContainer" containerID="3d03901045dd3898d4d4472388a38a1a6380b91b22032509551213e74b4671a3" Mar 10 00:35:33 crc kubenswrapper[4906]: E0310 00:35:33.577019 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bxtw4_openshift-machine-config-operator(72d61d35-0a64-45a5-8df3-9c429727deba)\"" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" Mar 10 00:35:36 crc kubenswrapper[4906]: I0310 00:35:36.333449 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-76f76f667-9zzn2_d2eb5756-28ad-445b-b3f8-1fa0846cd41b/operator/0.log" Mar 10 00:35:36 crc kubenswrapper[4906]: I0310 00:35:36.628670 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_qdr-test_44c504b9-118f-4b8b-ad28-f925b5e522bb/qdr/0.log" Mar 10 00:35:48 crc kubenswrapper[4906]: I0310 00:35:48.577513 4906 scope.go:117] "RemoveContainer" containerID="3d03901045dd3898d4d4472388a38a1a6380b91b22032509551213e74b4671a3" Mar 10 00:35:48 crc kubenswrapper[4906]: E0310 00:35:48.578322 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bxtw4_openshift-machine-config-operator(72d61d35-0a64-45a5-8df3-9c429727deba)\"" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" Mar 10 00:36:00 crc kubenswrapper[4906]: I0310 00:36:00.143272 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551716-kmxb7"] Mar 10 00:36:00 crc kubenswrapper[4906]: E0310 
00:36:00.145405 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0545a410-bd7a-4e49-ab46-73fe128ecca9" containerName="smoketest-collectd" Mar 10 00:36:00 crc kubenswrapper[4906]: I0310 00:36:00.145503 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="0545a410-bd7a-4e49-ab46-73fe128ecca9" containerName="smoketest-collectd" Mar 10 00:36:00 crc kubenswrapper[4906]: E0310 00:36:00.145583 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0545a410-bd7a-4e49-ab46-73fe128ecca9" containerName="smoketest-ceilometer" Mar 10 00:36:00 crc kubenswrapper[4906]: I0310 00:36:00.145692 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="0545a410-bd7a-4e49-ab46-73fe128ecca9" containerName="smoketest-ceilometer" Mar 10 00:36:00 crc kubenswrapper[4906]: E0310 00:36:00.145772 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92f8c05e-39c6-439a-ac06-984e5abe92a9" containerName="curl" Mar 10 00:36:00 crc kubenswrapper[4906]: I0310 00:36:00.145838 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="92f8c05e-39c6-439a-ac06-984e5abe92a9" containerName="curl" Mar 10 00:36:00 crc kubenswrapper[4906]: I0310 00:36:00.146075 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="0545a410-bd7a-4e49-ab46-73fe128ecca9" containerName="smoketest-ceilometer" Mar 10 00:36:00 crc kubenswrapper[4906]: I0310 00:36:00.146177 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="92f8c05e-39c6-439a-ac06-984e5abe92a9" containerName="curl" Mar 10 00:36:00 crc kubenswrapper[4906]: I0310 00:36:00.146266 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="0545a410-bd7a-4e49-ab46-73fe128ecca9" containerName="smoketest-collectd" Mar 10 00:36:00 crc kubenswrapper[4906]: I0310 00:36:00.146945 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551716-kmxb7" Mar 10 00:36:00 crc kubenswrapper[4906]: I0310 00:36:00.149062 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ktdr6" Mar 10 00:36:00 crc kubenswrapper[4906]: I0310 00:36:00.149240 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 00:36:00 crc kubenswrapper[4906]: I0310 00:36:00.150949 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 00:36:00 crc kubenswrapper[4906]: I0310 00:36:00.152182 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551716-kmxb7"] Mar 10 00:36:00 crc kubenswrapper[4906]: I0310 00:36:00.312880 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8q5v8\" (UniqueName: \"kubernetes.io/projected/175ba419-d912-49ca-8706-4e6b5ca2eeca-kube-api-access-8q5v8\") pod \"auto-csr-approver-29551716-kmxb7\" (UID: \"175ba419-d912-49ca-8706-4e6b5ca2eeca\") " pod="openshift-infra/auto-csr-approver-29551716-kmxb7" Mar 10 00:36:00 crc kubenswrapper[4906]: I0310 00:36:00.413831 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8q5v8\" (UniqueName: \"kubernetes.io/projected/175ba419-d912-49ca-8706-4e6b5ca2eeca-kube-api-access-8q5v8\") pod \"auto-csr-approver-29551716-kmxb7\" (UID: \"175ba419-d912-49ca-8706-4e6b5ca2eeca\") " pod="openshift-infra/auto-csr-approver-29551716-kmxb7" Mar 10 00:36:00 crc kubenswrapper[4906]: I0310 00:36:00.446290 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8q5v8\" (UniqueName: \"kubernetes.io/projected/175ba419-d912-49ca-8706-4e6b5ca2eeca-kube-api-access-8q5v8\") pod \"auto-csr-approver-29551716-kmxb7\" (UID: \"175ba419-d912-49ca-8706-4e6b5ca2eeca\") " 
pod="openshift-infra/auto-csr-approver-29551716-kmxb7" Mar 10 00:36:00 crc kubenswrapper[4906]: I0310 00:36:00.476006 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551716-kmxb7" Mar 10 00:36:00 crc kubenswrapper[4906]: I0310 00:36:00.917030 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551716-kmxb7"] Mar 10 00:36:01 crc kubenswrapper[4906]: I0310 00:36:01.576695 4906 scope.go:117] "RemoveContainer" containerID="3d03901045dd3898d4d4472388a38a1a6380b91b22032509551213e74b4671a3" Mar 10 00:36:01 crc kubenswrapper[4906]: E0310 00:36:01.576910 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bxtw4_openshift-machine-config-operator(72d61d35-0a64-45a5-8df3-9c429727deba)\"" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" Mar 10 00:36:01 crc kubenswrapper[4906]: I0310 00:36:01.880745 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551716-kmxb7" event={"ID":"175ba419-d912-49ca-8706-4e6b5ca2eeca","Type":"ContainerStarted","Data":"5429c7bac1c96c688f5edf640a2639cff2d7ccef51af8877ccbf26af571e4935"} Mar 10 00:36:02 crc kubenswrapper[4906]: I0310 00:36:02.648742 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ffwsk/must-gather-sgtxk"] Mar 10 00:36:02 crc kubenswrapper[4906]: I0310 00:36:02.651028 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ffwsk/must-gather-sgtxk" Mar 10 00:36:02 crc kubenswrapper[4906]: I0310 00:36:02.654391 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-ffwsk"/"default-dockercfg-pzdzl" Mar 10 00:36:02 crc kubenswrapper[4906]: I0310 00:36:02.654860 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-ffwsk"/"kube-root-ca.crt" Mar 10 00:36:02 crc kubenswrapper[4906]: I0310 00:36:02.655156 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-ffwsk"/"openshift-service-ca.crt" Mar 10 00:36:02 crc kubenswrapper[4906]: I0310 00:36:02.658551 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ffwsk/must-gather-sgtxk"] Mar 10 00:36:02 crc kubenswrapper[4906]: I0310 00:36:02.770578 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8n9c\" (UniqueName: \"kubernetes.io/projected/bc2480c1-8758-4811-b5a0-f4dcc7884a39-kube-api-access-v8n9c\") pod \"must-gather-sgtxk\" (UID: \"bc2480c1-8758-4811-b5a0-f4dcc7884a39\") " pod="openshift-must-gather-ffwsk/must-gather-sgtxk" Mar 10 00:36:02 crc kubenswrapper[4906]: I0310 00:36:02.770972 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/bc2480c1-8758-4811-b5a0-f4dcc7884a39-must-gather-output\") pod \"must-gather-sgtxk\" (UID: \"bc2480c1-8758-4811-b5a0-f4dcc7884a39\") " pod="openshift-must-gather-ffwsk/must-gather-sgtxk" Mar 10 00:36:02 crc kubenswrapper[4906]: I0310 00:36:02.872854 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/bc2480c1-8758-4811-b5a0-f4dcc7884a39-must-gather-output\") pod \"must-gather-sgtxk\" (UID: \"bc2480c1-8758-4811-b5a0-f4dcc7884a39\") " 
pod="openshift-must-gather-ffwsk/must-gather-sgtxk" Mar 10 00:36:02 crc kubenswrapper[4906]: I0310 00:36:02.873368 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8n9c\" (UniqueName: \"kubernetes.io/projected/bc2480c1-8758-4811-b5a0-f4dcc7884a39-kube-api-access-v8n9c\") pod \"must-gather-sgtxk\" (UID: \"bc2480c1-8758-4811-b5a0-f4dcc7884a39\") " pod="openshift-must-gather-ffwsk/must-gather-sgtxk" Mar 10 00:36:02 crc kubenswrapper[4906]: I0310 00:36:02.873529 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/bc2480c1-8758-4811-b5a0-f4dcc7884a39-must-gather-output\") pod \"must-gather-sgtxk\" (UID: \"bc2480c1-8758-4811-b5a0-f4dcc7884a39\") " pod="openshift-must-gather-ffwsk/must-gather-sgtxk" Mar 10 00:36:02 crc kubenswrapper[4906]: I0310 00:36:02.889407 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551716-kmxb7" event={"ID":"175ba419-d912-49ca-8706-4e6b5ca2eeca","Type":"ContainerStarted","Data":"5f9dad0db40c54cdfa6b43c1a027254b119bc375ab4ae5ee9c70f614c8eef379"} Mar 10 00:36:02 crc kubenswrapper[4906]: I0310 00:36:02.897521 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8n9c\" (UniqueName: \"kubernetes.io/projected/bc2480c1-8758-4811-b5a0-f4dcc7884a39-kube-api-access-v8n9c\") pod \"must-gather-sgtxk\" (UID: \"bc2480c1-8758-4811-b5a0-f4dcc7884a39\") " pod="openshift-must-gather-ffwsk/must-gather-sgtxk" Mar 10 00:36:02 crc kubenswrapper[4906]: I0310 00:36:02.972907 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ffwsk/must-gather-sgtxk" Mar 10 00:36:03 crc kubenswrapper[4906]: I0310 00:36:03.213412 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551716-kmxb7" podStartSLOduration=1.86709817 podStartE2EDuration="3.213387579s" podCreationTimestamp="2026-03-10 00:36:00 +0000 UTC" firstStartedPulling="2026-03-10 00:36:00.919821144 +0000 UTC m=+1787.067716256" lastFinishedPulling="2026-03-10 00:36:02.266110513 +0000 UTC m=+1788.414005665" observedRunningTime="2026-03-10 00:36:02.910593341 +0000 UTC m=+1789.058488453" watchObservedRunningTime="2026-03-10 00:36:03.213387579 +0000 UTC m=+1789.361282701" Mar 10 00:36:03 crc kubenswrapper[4906]: I0310 00:36:03.217040 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ffwsk/must-gather-sgtxk"] Mar 10 00:36:03 crc kubenswrapper[4906]: I0310 00:36:03.902615 4906 generic.go:334] "Generic (PLEG): container finished" podID="175ba419-d912-49ca-8706-4e6b5ca2eeca" containerID="5f9dad0db40c54cdfa6b43c1a027254b119bc375ab4ae5ee9c70f614c8eef379" exitCode=0 Mar 10 00:36:03 crc kubenswrapper[4906]: I0310 00:36:03.902751 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551716-kmxb7" event={"ID":"175ba419-d912-49ca-8706-4e6b5ca2eeca","Type":"ContainerDied","Data":"5f9dad0db40c54cdfa6b43c1a027254b119bc375ab4ae5ee9c70f614c8eef379"} Mar 10 00:36:03 crc kubenswrapper[4906]: I0310 00:36:03.905412 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ffwsk/must-gather-sgtxk" event={"ID":"bc2480c1-8758-4811-b5a0-f4dcc7884a39","Type":"ContainerStarted","Data":"b0b328f45a6be060d42f6f4f2037580db234e9c01564557dbe32bd396a9237a9"} Mar 10 00:36:05 crc kubenswrapper[4906]: I0310 00:36:05.311879 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551716-kmxb7" Mar 10 00:36:05 crc kubenswrapper[4906]: I0310 00:36:05.426862 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8q5v8\" (UniqueName: \"kubernetes.io/projected/175ba419-d912-49ca-8706-4e6b5ca2eeca-kube-api-access-8q5v8\") pod \"175ba419-d912-49ca-8706-4e6b5ca2eeca\" (UID: \"175ba419-d912-49ca-8706-4e6b5ca2eeca\") " Mar 10 00:36:05 crc kubenswrapper[4906]: I0310 00:36:05.436936 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/175ba419-d912-49ca-8706-4e6b5ca2eeca-kube-api-access-8q5v8" (OuterVolumeSpecName: "kube-api-access-8q5v8") pod "175ba419-d912-49ca-8706-4e6b5ca2eeca" (UID: "175ba419-d912-49ca-8706-4e6b5ca2eeca"). InnerVolumeSpecName "kube-api-access-8q5v8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:36:05 crc kubenswrapper[4906]: I0310 00:36:05.529138 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8q5v8\" (UniqueName: \"kubernetes.io/projected/175ba419-d912-49ca-8706-4e6b5ca2eeca-kube-api-access-8q5v8\") on node \"crc\" DevicePath \"\"" Mar 10 00:36:05 crc kubenswrapper[4906]: I0310 00:36:05.930518 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551716-kmxb7" event={"ID":"175ba419-d912-49ca-8706-4e6b5ca2eeca","Type":"ContainerDied","Data":"5429c7bac1c96c688f5edf640a2639cff2d7ccef51af8877ccbf26af571e4935"} Mar 10 00:36:05 crc kubenswrapper[4906]: I0310 00:36:05.930567 4906 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5429c7bac1c96c688f5edf640a2639cff2d7ccef51af8877ccbf26af571e4935" Mar 10 00:36:05 crc kubenswrapper[4906]: I0310 00:36:05.930654 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551716-kmxb7" Mar 10 00:36:05 crc kubenswrapper[4906]: I0310 00:36:05.968783 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551710-sgxxm"] Mar 10 00:36:05 crc kubenswrapper[4906]: I0310 00:36:05.974088 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551710-sgxxm"] Mar 10 00:36:06 crc kubenswrapper[4906]: I0310 00:36:06.585689 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b331f320-249a-4c86-b3ec-89d0bf6ab0d0" path="/var/lib/kubelet/pods/b331f320-249a-4c86-b3ec-89d0bf6ab0d0/volumes" Mar 10 00:36:09 crc kubenswrapper[4906]: I0310 00:36:09.960680 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ffwsk/must-gather-sgtxk" event={"ID":"bc2480c1-8758-4811-b5a0-f4dcc7884a39","Type":"ContainerStarted","Data":"99d300d896686940c8a67e9871ac0e52cd15e685a8375458705fdb5290204c8b"} Mar 10 00:36:09 crc kubenswrapper[4906]: I0310 00:36:09.961292 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ffwsk/must-gather-sgtxk" event={"ID":"bc2480c1-8758-4811-b5a0-f4dcc7884a39","Type":"ContainerStarted","Data":"02e13ca2234939a4e9cb4b8da2eaada285ca342f200ef8919c89f0f809831a20"} Mar 10 00:36:09 crc kubenswrapper[4906]: I0310 00:36:09.980178 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-ffwsk/must-gather-sgtxk" podStartSLOduration=1.988707767 podStartE2EDuration="7.980156396s" podCreationTimestamp="2026-03-10 00:36:02 +0000 UTC" firstStartedPulling="2026-03-10 00:36:03.245007868 +0000 UTC m=+1789.392902980" lastFinishedPulling="2026-03-10 00:36:09.236456497 +0000 UTC m=+1795.384351609" observedRunningTime="2026-03-10 00:36:09.972182742 +0000 UTC m=+1796.120077854" watchObservedRunningTime="2026-03-10 00:36:09.980156396 +0000 UTC m=+1796.128051508" Mar 10 00:36:16 crc kubenswrapper[4906]: 
I0310 00:36:16.576414 4906 scope.go:117] "RemoveContainer" containerID="3d03901045dd3898d4d4472388a38a1a6380b91b22032509551213e74b4671a3" Mar 10 00:36:16 crc kubenswrapper[4906]: E0310 00:36:16.577075 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bxtw4_openshift-machine-config-operator(72d61d35-0a64-45a5-8df3-9c429727deba)\"" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" Mar 10 00:36:27 crc kubenswrapper[4906]: I0310 00:36:27.576805 4906 scope.go:117] "RemoveContainer" containerID="3d03901045dd3898d4d4472388a38a1a6380b91b22032509551213e74b4671a3" Mar 10 00:36:27 crc kubenswrapper[4906]: E0310 00:36:27.577507 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bxtw4_openshift-machine-config-operator(72d61d35-0a64-45a5-8df3-9c429727deba)\"" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" Mar 10 00:36:28 crc kubenswrapper[4906]: I0310 00:36:28.964884 4906 scope.go:117] "RemoveContainer" containerID="41236f41b325fe004bc18cc590365d958c7b0b793d4ab9dadfd6d45082642cfa" Mar 10 00:36:29 crc kubenswrapper[4906]: I0310 00:36:29.000164 4906 scope.go:117] "RemoveContainer" containerID="c228b9f5968a296b723ea7a7472f74d59f19ed119f2e9fcdf135bdf95bc80716" Mar 10 00:36:29 crc kubenswrapper[4906]: I0310 00:36:29.041809 4906 scope.go:117] "RemoveContainer" containerID="c2765d137969e172edca3da8241f8c85f1f41730aa9e2b06f5ebc6ab6d39c687" Mar 10 00:36:29 crc kubenswrapper[4906]: I0310 00:36:29.071898 4906 scope.go:117] "RemoveContainer" 
containerID="2c628760c387b1aafef06f0a246a2e5b5d795c623756dfa5334b3ca000207817" Mar 10 00:36:29 crc kubenswrapper[4906]: I0310 00:36:29.103839 4906 scope.go:117] "RemoveContainer" containerID="80f9436bc15a06cf11da9ef27ffcbf8edfb89546604e89737143d24c8f3d8f63" Mar 10 00:36:40 crc kubenswrapper[4906]: I0310 00:36:40.576730 4906 scope.go:117] "RemoveContainer" containerID="3d03901045dd3898d4d4472388a38a1a6380b91b22032509551213e74b4671a3" Mar 10 00:36:40 crc kubenswrapper[4906]: E0310 00:36:40.577348 4906 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bxtw4_openshift-machine-config-operator(72d61d35-0a64-45a5-8df3-9c429727deba)\"" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" Mar 10 00:36:49 crc kubenswrapper[4906]: I0310 00:36:49.390534 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-xd62p_66a5f13c-92bf-4e0b-ac5e-f14de7c6f72a/control-plane-machine-set-operator/0.log" Mar 10 00:36:49 crc kubenswrapper[4906]: I0310 00:36:49.542208 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-lrj5w_e2a24c21-ab84-4989-a154-b8c9118c31bf/machine-api-operator/0.log" Mar 10 00:36:49 crc kubenswrapper[4906]: I0310 00:36:49.548267 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-lrj5w_e2a24c21-ab84-4989-a154-b8c9118c31bf/kube-rbac-proxy/0.log" Mar 10 00:36:53 crc kubenswrapper[4906]: I0310 00:36:53.576131 4906 scope.go:117] "RemoveContainer" containerID="3d03901045dd3898d4d4472388a38a1a6380b91b22032509551213e74b4671a3" Mar 10 00:36:53 crc kubenswrapper[4906]: E0310 00:36:53.576740 4906 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bxtw4_openshift-machine-config-operator(72d61d35-0a64-45a5-8df3-9c429727deba)\"" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" Mar 10 00:36:55 crc kubenswrapper[4906]: I0310 00:36:55.560097 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/infrawatch-operators-4qqsh"] Mar 10 00:36:55 crc kubenswrapper[4906]: E0310 00:36:55.561473 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="175ba419-d912-49ca-8706-4e6b5ca2eeca" containerName="oc" Mar 10 00:36:55 crc kubenswrapper[4906]: I0310 00:36:55.561577 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="175ba419-d912-49ca-8706-4e6b5ca2eeca" containerName="oc" Mar 10 00:36:55 crc kubenswrapper[4906]: I0310 00:36:55.561860 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="175ba419-d912-49ca-8706-4e6b5ca2eeca" containerName="oc" Mar 10 00:36:55 crc kubenswrapper[4906]: I0310 00:36:55.562728 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-4qqsh" Mar 10 00:36:55 crc kubenswrapper[4906]: I0310 00:36:55.569323 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-4qqsh"] Mar 10 00:36:55 crc kubenswrapper[4906]: I0310 00:36:55.707129 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtzvd\" (UniqueName: \"kubernetes.io/projected/2787021f-985b-4d5c-ac05-7bd0c714daae-kube-api-access-wtzvd\") pod \"infrawatch-operators-4qqsh\" (UID: \"2787021f-985b-4d5c-ac05-7bd0c714daae\") " pod="service-telemetry/infrawatch-operators-4qqsh" Mar 10 00:36:55 crc kubenswrapper[4906]: I0310 00:36:55.808594 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtzvd\" (UniqueName: \"kubernetes.io/projected/2787021f-985b-4d5c-ac05-7bd0c714daae-kube-api-access-wtzvd\") pod \"infrawatch-operators-4qqsh\" (UID: \"2787021f-985b-4d5c-ac05-7bd0c714daae\") " pod="service-telemetry/infrawatch-operators-4qqsh" Mar 10 00:36:55 crc kubenswrapper[4906]: I0310 00:36:55.827342 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtzvd\" (UniqueName: \"kubernetes.io/projected/2787021f-985b-4d5c-ac05-7bd0c714daae-kube-api-access-wtzvd\") pod \"infrawatch-operators-4qqsh\" (UID: \"2787021f-985b-4d5c-ac05-7bd0c714daae\") " pod="service-telemetry/infrawatch-operators-4qqsh" Mar 10 00:36:55 crc kubenswrapper[4906]: I0310 00:36:55.889258 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-4qqsh" Mar 10 00:36:56 crc kubenswrapper[4906]: I0310 00:36:56.306185 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-4qqsh"] Mar 10 00:36:57 crc kubenswrapper[4906]: I0310 00:36:57.320582 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-4qqsh" event={"ID":"2787021f-985b-4d5c-ac05-7bd0c714daae","Type":"ContainerStarted","Data":"56f53f493234d59aaa0d6624ec8c0d28b975578c09ceef9bdb09e968a43cdb32"} Mar 10 00:36:57 crc kubenswrapper[4906]: I0310 00:36:57.320848 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-4qqsh" event={"ID":"2787021f-985b-4d5c-ac05-7bd0c714daae","Type":"ContainerStarted","Data":"aa2c414e934f93b419866fd7c22deb5fc5156a819efa6f310c17f9b5f96d7448"} Mar 10 00:36:57 crc kubenswrapper[4906]: I0310 00:36:57.336506 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/infrawatch-operators-4qqsh" podStartSLOduration=2.2012031260000002 podStartE2EDuration="2.336485595s" podCreationTimestamp="2026-03-10 00:36:55 +0000 UTC" firstStartedPulling="2026-03-10 00:36:56.323388634 +0000 UTC m=+1842.471283776" lastFinishedPulling="2026-03-10 00:36:56.458671133 +0000 UTC m=+1842.606566245" observedRunningTime="2026-03-10 00:36:57.33593622 +0000 UTC m=+1843.483831352" watchObservedRunningTime="2026-03-10 00:36:57.336485595 +0000 UTC m=+1843.484380727" Mar 10 00:37:00 crc kubenswrapper[4906]: I0310 00:37:00.758461 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-kbpfz_e31ca248-2d64-4a9b-82d0-e374779ccb46/cert-manager-controller/0.log" Mar 10 00:37:00 crc kubenswrapper[4906]: I0310 00:37:00.853377 4906 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-tlmd5_dc844683-db0c-4dde-8600-17b00f2d66bd/cert-manager-cainjector/0.log" Mar 10 00:37:00 crc kubenswrapper[4906]: I0310 00:37:00.951007 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-kbnf9_30781eae-a8cc-4149-bee1-43feb29845ba/cert-manager-webhook/0.log" Mar 10 00:37:04 crc kubenswrapper[4906]: I0310 00:37:04.591913 4906 scope.go:117] "RemoveContainer" containerID="3d03901045dd3898d4d4472388a38a1a6380b91b22032509551213e74b4671a3" Mar 10 00:37:05 crc kubenswrapper[4906]: I0310 00:37:05.381925 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" event={"ID":"72d61d35-0a64-45a5-8df3-9c429727deba","Type":"ContainerStarted","Data":"beb22ffdab30b1e8caaf3c4130c31ace21314768233d626cae4396452b5de532"} Mar 10 00:37:05 crc kubenswrapper[4906]: I0310 00:37:05.889477 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/infrawatch-operators-4qqsh" Mar 10 00:37:05 crc kubenswrapper[4906]: I0310 00:37:05.889791 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/infrawatch-operators-4qqsh" Mar 10 00:37:05 crc kubenswrapper[4906]: I0310 00:37:05.924891 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/infrawatch-operators-4qqsh" Mar 10 00:37:06 crc kubenswrapper[4906]: I0310 00:37:06.434059 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/infrawatch-operators-4qqsh" Mar 10 00:37:06 crc kubenswrapper[4906]: I0310 00:37:06.476557 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-4qqsh"] Mar 10 00:37:08 crc kubenswrapper[4906]: I0310 00:37:08.402029 4906 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="service-telemetry/infrawatch-operators-4qqsh" podUID="2787021f-985b-4d5c-ac05-7bd0c714daae" containerName="registry-server" containerID="cri-o://56f53f493234d59aaa0d6624ec8c0d28b975578c09ceef9bdb09e968a43cdb32" gracePeriod=2 Mar 10 00:37:08 crc kubenswrapper[4906]: I0310 00:37:08.788080 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-4qqsh" Mar 10 00:37:08 crc kubenswrapper[4906]: I0310 00:37:08.919036 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtzvd\" (UniqueName: \"kubernetes.io/projected/2787021f-985b-4d5c-ac05-7bd0c714daae-kube-api-access-wtzvd\") pod \"2787021f-985b-4d5c-ac05-7bd0c714daae\" (UID: \"2787021f-985b-4d5c-ac05-7bd0c714daae\") " Mar 10 00:37:08 crc kubenswrapper[4906]: I0310 00:37:08.924885 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2787021f-985b-4d5c-ac05-7bd0c714daae-kube-api-access-wtzvd" (OuterVolumeSpecName: "kube-api-access-wtzvd") pod "2787021f-985b-4d5c-ac05-7bd0c714daae" (UID: "2787021f-985b-4d5c-ac05-7bd0c714daae"). InnerVolumeSpecName "kube-api-access-wtzvd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:37:09 crc kubenswrapper[4906]: I0310 00:37:09.021069 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtzvd\" (UniqueName: \"kubernetes.io/projected/2787021f-985b-4d5c-ac05-7bd0c714daae-kube-api-access-wtzvd\") on node \"crc\" DevicePath \"\"" Mar 10 00:37:09 crc kubenswrapper[4906]: I0310 00:37:09.411706 4906 generic.go:334] "Generic (PLEG): container finished" podID="2787021f-985b-4d5c-ac05-7bd0c714daae" containerID="56f53f493234d59aaa0d6624ec8c0d28b975578c09ceef9bdb09e968a43cdb32" exitCode=0 Mar 10 00:37:09 crc kubenswrapper[4906]: I0310 00:37:09.411752 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-4qqsh" event={"ID":"2787021f-985b-4d5c-ac05-7bd0c714daae","Type":"ContainerDied","Data":"56f53f493234d59aaa0d6624ec8c0d28b975578c09ceef9bdb09e968a43cdb32"} Mar 10 00:37:09 crc kubenswrapper[4906]: I0310 00:37:09.411760 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-4qqsh" Mar 10 00:37:09 crc kubenswrapper[4906]: I0310 00:37:09.411786 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-4qqsh" event={"ID":"2787021f-985b-4d5c-ac05-7bd0c714daae","Type":"ContainerDied","Data":"aa2c414e934f93b419866fd7c22deb5fc5156a819efa6f310c17f9b5f96d7448"} Mar 10 00:37:09 crc kubenswrapper[4906]: I0310 00:37:09.411810 4906 scope.go:117] "RemoveContainer" containerID="56f53f493234d59aaa0d6624ec8c0d28b975578c09ceef9bdb09e968a43cdb32" Mar 10 00:37:09 crc kubenswrapper[4906]: I0310 00:37:09.438628 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-4qqsh"] Mar 10 00:37:09 crc kubenswrapper[4906]: I0310 00:37:09.444604 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/infrawatch-operators-4qqsh"] Mar 10 00:37:09 crc kubenswrapper[4906]: I0310 00:37:09.446083 4906 scope.go:117] "RemoveContainer" containerID="56f53f493234d59aaa0d6624ec8c0d28b975578c09ceef9bdb09e968a43cdb32" Mar 10 00:37:09 crc kubenswrapper[4906]: E0310 00:37:09.446626 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56f53f493234d59aaa0d6624ec8c0d28b975578c09ceef9bdb09e968a43cdb32\": container with ID starting with 56f53f493234d59aaa0d6624ec8c0d28b975578c09ceef9bdb09e968a43cdb32 not found: ID does not exist" containerID="56f53f493234d59aaa0d6624ec8c0d28b975578c09ceef9bdb09e968a43cdb32" Mar 10 00:37:09 crc kubenswrapper[4906]: I0310 00:37:09.446674 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56f53f493234d59aaa0d6624ec8c0d28b975578c09ceef9bdb09e968a43cdb32"} err="failed to get container status \"56f53f493234d59aaa0d6624ec8c0d28b975578c09ceef9bdb09e968a43cdb32\": rpc error: code = NotFound desc = could not find container 
\"56f53f493234d59aaa0d6624ec8c0d28b975578c09ceef9bdb09e968a43cdb32\": container with ID starting with 56f53f493234d59aaa0d6624ec8c0d28b975578c09ceef9bdb09e968a43cdb32 not found: ID does not exist" Mar 10 00:37:10 crc kubenswrapper[4906]: I0310 00:37:10.589424 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2787021f-985b-4d5c-ac05-7bd0c714daae" path="/var/lib/kubelet/pods/2787021f-985b-4d5c-ac05-7bd0c714daae/volumes" Mar 10 00:37:14 crc kubenswrapper[4906]: I0310 00:37:14.037513 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-qm858_a13dbb4b-4d05-4b5a-9155-863ae39b84d7/prometheus-operator/0.log" Mar 10 00:37:14 crc kubenswrapper[4906]: I0310 00:37:14.211404 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6878d69d47-mbmcr_6604103a-2326-470c-b01f-a99fea58b571/prometheus-operator-admission-webhook/0.log" Mar 10 00:37:14 crc kubenswrapper[4906]: I0310 00:37:14.265602 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6878d69d47-mlwmd_6f2dab40-1946-479a-9741-637188d9fe10/prometheus-operator-admission-webhook/0.log" Mar 10 00:37:14 crc kubenswrapper[4906]: I0310 00:37:14.378134 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-6w9dp_ba8e917b-ba75-43ca-ba9e-d52c72031402/operator/0.log" Mar 10 00:37:14 crc kubenswrapper[4906]: I0310 00:37:14.444462 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-4gjdk_05f0097c-957a-4679-bfde-f6fd3bb64a31/perses-operator/0.log" Mar 10 00:37:28 crc kubenswrapper[4906]: I0310 00:37:28.733566 4906 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxrb6c_98165996-cb0a-4820-b1b0-e014d3362b2d/util/0.log" Mar 10 00:37:28 crc kubenswrapper[4906]: I0310 00:37:28.921601 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxrb6c_98165996-cb0a-4820-b1b0-e014d3362b2d/pull/0.log" Mar 10 00:37:28 crc kubenswrapper[4906]: I0310 00:37:28.950814 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxrb6c_98165996-cb0a-4820-b1b0-e014d3362b2d/pull/0.log" Mar 10 00:37:28 crc kubenswrapper[4906]: I0310 00:37:28.970967 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxrb6c_98165996-cb0a-4820-b1b0-e014d3362b2d/util/0.log" Mar 10 00:37:29 crc kubenswrapper[4906]: I0310 00:37:29.093473 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxrb6c_98165996-cb0a-4820-b1b0-e014d3362b2d/util/0.log" Mar 10 00:37:29 crc kubenswrapper[4906]: I0310 00:37:29.094450 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxrb6c_98165996-cb0a-4820-b1b0-e014d3362b2d/pull/0.log" Mar 10 00:37:29 crc kubenswrapper[4906]: I0310 00:37:29.125562 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxrb6c_98165996-cb0a-4820-b1b0-e014d3362b2d/extract/0.log" Mar 10 00:37:29 crc kubenswrapper[4906]: I0310 00:37:29.310436 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39elzlz7_9f632493-caa4-489a-8960-e6980ffd1659/util/0.log" Mar 10 
00:37:29 crc kubenswrapper[4906]: I0310 00:37:29.435262 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39elzlz7_9f632493-caa4-489a-8960-e6980ffd1659/util/0.log" Mar 10 00:37:29 crc kubenswrapper[4906]: I0310 00:37:29.444426 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39elzlz7_9f632493-caa4-489a-8960-e6980ffd1659/pull/0.log" Mar 10 00:37:29 crc kubenswrapper[4906]: I0310 00:37:29.461184 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39elzlz7_9f632493-caa4-489a-8960-e6980ffd1659/pull/0.log" Mar 10 00:37:29 crc kubenswrapper[4906]: I0310 00:37:29.616278 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39elzlz7_9f632493-caa4-489a-8960-e6980ffd1659/extract/0.log" Mar 10 00:37:29 crc kubenswrapper[4906]: I0310 00:37:29.617605 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39elzlz7_9f632493-caa4-489a-8960-e6980ffd1659/util/0.log" Mar 10 00:37:29 crc kubenswrapper[4906]: I0310 00:37:29.618958 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39elzlz7_9f632493-caa4-489a-8960-e6980ffd1659/pull/0.log" Mar 10 00:37:29 crc kubenswrapper[4906]: I0310 00:37:29.764016 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gchrq_79926529-086e-4612-ac41-ad0e16ac2a4d/util/0.log" Mar 10 00:37:29 crc kubenswrapper[4906]: I0310 00:37:29.898941 4906 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gchrq_79926529-086e-4612-ac41-ad0e16ac2a4d/util/0.log" Mar 10 00:37:29 crc kubenswrapper[4906]: I0310 00:37:29.957289 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gchrq_79926529-086e-4612-ac41-ad0e16ac2a4d/pull/0.log" Mar 10 00:37:29 crc kubenswrapper[4906]: I0310 00:37:29.982822 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gchrq_79926529-086e-4612-ac41-ad0e16ac2a4d/pull/0.log" Mar 10 00:37:30 crc kubenswrapper[4906]: I0310 00:37:30.153950 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gchrq_79926529-086e-4612-ac41-ad0e16ac2a4d/extract/0.log" Mar 10 00:37:30 crc kubenswrapper[4906]: I0310 00:37:30.167696 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gchrq_79926529-086e-4612-ac41-ad0e16ac2a4d/pull/0.log" Mar 10 00:37:30 crc kubenswrapper[4906]: I0310 00:37:30.180465 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gchrq_79926529-086e-4612-ac41-ad0e16ac2a4d/util/0.log" Mar 10 00:37:30 crc kubenswrapper[4906]: I0310 00:37:30.298934 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cv5f9_a7ead5da-0c4e-4774-b55c-fcacd2943780/util/0.log" Mar 10 00:37:30 crc kubenswrapper[4906]: I0310 00:37:30.485689 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cv5f9_a7ead5da-0c4e-4774-b55c-fcacd2943780/util/0.log" Mar 10 
00:37:30 crc kubenswrapper[4906]: I0310 00:37:30.498149 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cv5f9_a7ead5da-0c4e-4774-b55c-fcacd2943780/pull/0.log" Mar 10 00:37:30 crc kubenswrapper[4906]: I0310 00:37:30.530609 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cv5f9_a7ead5da-0c4e-4774-b55c-fcacd2943780/pull/0.log" Mar 10 00:37:30 crc kubenswrapper[4906]: I0310 00:37:30.688993 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cv5f9_a7ead5da-0c4e-4774-b55c-fcacd2943780/util/0.log" Mar 10 00:37:30 crc kubenswrapper[4906]: I0310 00:37:30.715896 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cv5f9_a7ead5da-0c4e-4774-b55c-fcacd2943780/pull/0.log" Mar 10 00:37:30 crc kubenswrapper[4906]: I0310 00:37:30.737745 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08cv5f9_a7ead5da-0c4e-4774-b55c-fcacd2943780/extract/0.log" Mar 10 00:37:30 crc kubenswrapper[4906]: I0310 00:37:30.873138 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2zhb6_2b962aff-3926-4d40-b95f-ea1c8062ede2/extract-utilities/0.log" Mar 10 00:37:31 crc kubenswrapper[4906]: I0310 00:37:31.010316 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2zhb6_2b962aff-3926-4d40-b95f-ea1c8062ede2/extract-content/0.log" Mar 10 00:37:31 crc kubenswrapper[4906]: I0310 00:37:31.014276 4906 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-2zhb6_2b962aff-3926-4d40-b95f-ea1c8062ede2/extract-utilities/0.log" Mar 10 00:37:31 crc kubenswrapper[4906]: I0310 00:37:31.046663 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2zhb6_2b962aff-3926-4d40-b95f-ea1c8062ede2/extract-content/0.log" Mar 10 00:37:31 crc kubenswrapper[4906]: I0310 00:37:31.231039 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2zhb6_2b962aff-3926-4d40-b95f-ea1c8062ede2/extract-utilities/0.log" Mar 10 00:37:31 crc kubenswrapper[4906]: I0310 00:37:31.250450 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2zhb6_2b962aff-3926-4d40-b95f-ea1c8062ede2/extract-content/0.log" Mar 10 00:37:31 crc kubenswrapper[4906]: I0310 00:37:31.447922 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9x7g5_ac5a82c2-3734-4064-bb05-2cf40dededee/extract-utilities/0.log" Mar 10 00:37:31 crc kubenswrapper[4906]: I0310 00:37:31.530648 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2zhb6_2b962aff-3926-4d40-b95f-ea1c8062ede2/registry-server/0.log" Mar 10 00:37:31 crc kubenswrapper[4906]: I0310 00:37:31.638731 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9x7g5_ac5a82c2-3734-4064-bb05-2cf40dededee/extract-utilities/0.log" Mar 10 00:37:31 crc kubenswrapper[4906]: I0310 00:37:31.638744 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9x7g5_ac5a82c2-3734-4064-bb05-2cf40dededee/extract-content/0.log" Mar 10 00:37:31 crc kubenswrapper[4906]: I0310 00:37:31.677712 4906 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-9x7g5_ac5a82c2-3734-4064-bb05-2cf40dededee/extract-content/0.log" Mar 10 00:37:31 crc kubenswrapper[4906]: I0310 00:37:31.826178 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9x7g5_ac5a82c2-3734-4064-bb05-2cf40dededee/extract-utilities/0.log" Mar 10 00:37:31 crc kubenswrapper[4906]: I0310 00:37:31.830402 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9x7g5_ac5a82c2-3734-4064-bb05-2cf40dededee/extract-content/0.log" Mar 10 00:37:32 crc kubenswrapper[4906]: I0310 00:37:32.050379 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-bdv5q_2bafcdb0-094e-4426-96b0-c23d59d49da2/marketplace-operator/0.log" Mar 10 00:37:32 crc kubenswrapper[4906]: I0310 00:37:32.133474 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zr8fw_0c6467f0-f671-4388-9851-05416de6f4b1/extract-utilities/0.log" Mar 10 00:37:32 crc kubenswrapper[4906]: I0310 00:37:32.173734 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9x7g5_ac5a82c2-3734-4064-bb05-2cf40dededee/registry-server/0.log" Mar 10 00:37:32 crc kubenswrapper[4906]: I0310 00:37:32.352719 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zr8fw_0c6467f0-f671-4388-9851-05416de6f4b1/extract-utilities/0.log" Mar 10 00:37:32 crc kubenswrapper[4906]: I0310 00:37:32.362489 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zr8fw_0c6467f0-f671-4388-9851-05416de6f4b1/extract-content/0.log" Mar 10 00:37:32 crc kubenswrapper[4906]: I0310 00:37:32.380728 4906 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-zr8fw_0c6467f0-f671-4388-9851-05416de6f4b1/extract-content/0.log" Mar 10 00:37:32 crc kubenswrapper[4906]: I0310 00:37:32.516230 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zr8fw_0c6467f0-f671-4388-9851-05416de6f4b1/extract-utilities/0.log" Mar 10 00:37:32 crc kubenswrapper[4906]: I0310 00:37:32.517581 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zr8fw_0c6467f0-f671-4388-9851-05416de6f4b1/extract-content/0.log" Mar 10 00:37:32 crc kubenswrapper[4906]: I0310 00:37:32.792522 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zr8fw_0c6467f0-f671-4388-9851-05416de6f4b1/registry-server/0.log" Mar 10 00:37:45 crc kubenswrapper[4906]: I0310 00:37:45.803010 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-qm858_a13dbb4b-4d05-4b5a-9155-863ae39b84d7/prometheus-operator/0.log" Mar 10 00:37:45 crc kubenswrapper[4906]: I0310 00:37:45.848885 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6878d69d47-mbmcr_6604103a-2326-470c-b01f-a99fea58b571/prometheus-operator-admission-webhook/0.log" Mar 10 00:37:45 crc kubenswrapper[4906]: I0310 00:37:45.883889 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6878d69d47-mlwmd_6f2dab40-1946-479a-9741-637188d9fe10/prometheus-operator-admission-webhook/0.log" Mar 10 00:37:45 crc kubenswrapper[4906]: I0310 00:37:45.985334 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-6w9dp_ba8e917b-ba75-43ca-ba9e-d52c72031402/operator/0.log" Mar 10 00:37:46 crc kubenswrapper[4906]: I0310 00:37:46.024121 4906 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-4gjdk_05f0097c-957a-4679-bfde-f6fd3bb64a31/perses-operator/0.log" Mar 10 00:38:00 crc kubenswrapper[4906]: I0310 00:38:00.145553 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551718-wlgwq"] Mar 10 00:38:00 crc kubenswrapper[4906]: E0310 00:38:00.146418 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2787021f-985b-4d5c-ac05-7bd0c714daae" containerName="registry-server" Mar 10 00:38:00 crc kubenswrapper[4906]: I0310 00:38:00.146434 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="2787021f-985b-4d5c-ac05-7bd0c714daae" containerName="registry-server" Mar 10 00:38:00 crc kubenswrapper[4906]: I0310 00:38:00.146595 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="2787021f-985b-4d5c-ac05-7bd0c714daae" containerName="registry-server" Mar 10 00:38:00 crc kubenswrapper[4906]: I0310 00:38:00.147158 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551718-wlgwq" Mar 10 00:38:00 crc kubenswrapper[4906]: I0310 00:38:00.148991 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 00:38:00 crc kubenswrapper[4906]: I0310 00:38:00.149324 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 00:38:00 crc kubenswrapper[4906]: I0310 00:38:00.155028 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ktdr6" Mar 10 00:38:00 crc kubenswrapper[4906]: I0310 00:38:00.159730 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551718-wlgwq"] Mar 10 00:38:00 crc kubenswrapper[4906]: I0310 00:38:00.290011 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwnkz\" (UniqueName: \"kubernetes.io/projected/86a1f117-5ccb-4239-9dcd-39d90ea0d0ea-kube-api-access-cwnkz\") pod \"auto-csr-approver-29551718-wlgwq\" (UID: \"86a1f117-5ccb-4239-9dcd-39d90ea0d0ea\") " pod="openshift-infra/auto-csr-approver-29551718-wlgwq" Mar 10 00:38:00 crc kubenswrapper[4906]: I0310 00:38:00.391655 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwnkz\" (UniqueName: \"kubernetes.io/projected/86a1f117-5ccb-4239-9dcd-39d90ea0d0ea-kube-api-access-cwnkz\") pod \"auto-csr-approver-29551718-wlgwq\" (UID: \"86a1f117-5ccb-4239-9dcd-39d90ea0d0ea\") " pod="openshift-infra/auto-csr-approver-29551718-wlgwq" Mar 10 00:38:00 crc kubenswrapper[4906]: I0310 00:38:00.426749 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwnkz\" (UniqueName: \"kubernetes.io/projected/86a1f117-5ccb-4239-9dcd-39d90ea0d0ea-kube-api-access-cwnkz\") pod \"auto-csr-approver-29551718-wlgwq\" (UID: \"86a1f117-5ccb-4239-9dcd-39d90ea0d0ea\") " 
pod="openshift-infra/auto-csr-approver-29551718-wlgwq" Mar 10 00:38:00 crc kubenswrapper[4906]: I0310 00:38:00.512313 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551718-wlgwq" Mar 10 00:38:00 crc kubenswrapper[4906]: I0310 00:38:00.729805 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551718-wlgwq"] Mar 10 00:38:00 crc kubenswrapper[4906]: W0310 00:38:00.745001 4906 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86a1f117_5ccb_4239_9dcd_39d90ea0d0ea.slice/crio-2f56ea0fd79c760fb0b6a54b38f1ba05b5a8dc910158ffe33feaf8d9e5e4f5e1 WatchSource:0}: Error finding container 2f56ea0fd79c760fb0b6a54b38f1ba05b5a8dc910158ffe33feaf8d9e5e4f5e1: Status 404 returned error can't find the container with id 2f56ea0fd79c760fb0b6a54b38f1ba05b5a8dc910158ffe33feaf8d9e5e4f5e1 Mar 10 00:38:00 crc kubenswrapper[4906]: I0310 00:38:00.843830 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551718-wlgwq" event={"ID":"86a1f117-5ccb-4239-9dcd-39d90ea0d0ea","Type":"ContainerStarted","Data":"2f56ea0fd79c760fb0b6a54b38f1ba05b5a8dc910158ffe33feaf8d9e5e4f5e1"} Mar 10 00:38:02 crc kubenswrapper[4906]: E0310 00:38:02.524526 4906 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86a1f117_5ccb_4239_9dcd_39d90ea0d0ea.slice/crio-ec390883330f8cf64f586ba0b4512b2a5b11302e0d97daaace27173ebb3d1019.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86a1f117_5ccb_4239_9dcd_39d90ea0d0ea.slice/crio-conmon-ec390883330f8cf64f586ba0b4512b2a5b11302e0d97daaace27173ebb3d1019.scope\": RecentStats: unable to find data in memory cache]" Mar 10 00:38:02 crc kubenswrapper[4906]: I0310 
00:38:02.875080 4906 generic.go:334] "Generic (PLEG): container finished" podID="86a1f117-5ccb-4239-9dcd-39d90ea0d0ea" containerID="ec390883330f8cf64f586ba0b4512b2a5b11302e0d97daaace27173ebb3d1019" exitCode=0 Mar 10 00:38:02 crc kubenswrapper[4906]: I0310 00:38:02.875216 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551718-wlgwq" event={"ID":"86a1f117-5ccb-4239-9dcd-39d90ea0d0ea","Type":"ContainerDied","Data":"ec390883330f8cf64f586ba0b4512b2a5b11302e0d97daaace27173ebb3d1019"} Mar 10 00:38:04 crc kubenswrapper[4906]: I0310 00:38:04.165042 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551718-wlgwq" Mar 10 00:38:04 crc kubenswrapper[4906]: I0310 00:38:04.245064 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwnkz\" (UniqueName: \"kubernetes.io/projected/86a1f117-5ccb-4239-9dcd-39d90ea0d0ea-kube-api-access-cwnkz\") pod \"86a1f117-5ccb-4239-9dcd-39d90ea0d0ea\" (UID: \"86a1f117-5ccb-4239-9dcd-39d90ea0d0ea\") " Mar 10 00:38:04 crc kubenswrapper[4906]: I0310 00:38:04.259838 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86a1f117-5ccb-4239-9dcd-39d90ea0d0ea-kube-api-access-cwnkz" (OuterVolumeSpecName: "kube-api-access-cwnkz") pod "86a1f117-5ccb-4239-9dcd-39d90ea0d0ea" (UID: "86a1f117-5ccb-4239-9dcd-39d90ea0d0ea"). InnerVolumeSpecName "kube-api-access-cwnkz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:38:04 crc kubenswrapper[4906]: I0310 00:38:04.347843 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwnkz\" (UniqueName: \"kubernetes.io/projected/86a1f117-5ccb-4239-9dcd-39d90ea0d0ea-kube-api-access-cwnkz\") on node \"crc\" DevicePath \"\"" Mar 10 00:38:04 crc kubenswrapper[4906]: I0310 00:38:04.898763 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551718-wlgwq" event={"ID":"86a1f117-5ccb-4239-9dcd-39d90ea0d0ea","Type":"ContainerDied","Data":"2f56ea0fd79c760fb0b6a54b38f1ba05b5a8dc910158ffe33feaf8d9e5e4f5e1"} Mar 10 00:38:04 crc kubenswrapper[4906]: I0310 00:38:04.898801 4906 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f56ea0fd79c760fb0b6a54b38f1ba05b5a8dc910158ffe33feaf8d9e5e4f5e1" Mar 10 00:38:04 crc kubenswrapper[4906]: I0310 00:38:04.898826 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551718-wlgwq" Mar 10 00:38:05 crc kubenswrapper[4906]: I0310 00:38:05.250879 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551712-m7ff8"] Mar 10 00:38:05 crc kubenswrapper[4906]: I0310 00:38:05.262271 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551712-m7ff8"] Mar 10 00:38:06 crc kubenswrapper[4906]: I0310 00:38:06.586521 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d5bc10c-b23f-448d-90a0-539da38f2f76" path="/var/lib/kubelet/pods/3d5bc10c-b23f-448d-90a0-539da38f2f76/volumes" Mar 10 00:38:29 crc kubenswrapper[4906]: I0310 00:38:29.222244 4906 scope.go:117] "RemoveContainer" containerID="ed64661f222bc2b81849a942b7849b919a743347e94fc2eb75b83c5ac0af93a8" Mar 10 00:38:32 crc kubenswrapper[4906]: I0310 00:38:32.073501 4906 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-hrpf6"] Mar 10 00:38:32 crc kubenswrapper[4906]: E0310 00:38:32.074403 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86a1f117-5ccb-4239-9dcd-39d90ea0d0ea" containerName="oc" Mar 10 00:38:32 crc kubenswrapper[4906]: I0310 00:38:32.074436 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="86a1f117-5ccb-4239-9dcd-39d90ea0d0ea" containerName="oc" Mar 10 00:38:32 crc kubenswrapper[4906]: I0310 00:38:32.075070 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="86a1f117-5ccb-4239-9dcd-39d90ea0d0ea" containerName="oc" Mar 10 00:38:32 crc kubenswrapper[4906]: I0310 00:38:32.076327 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hrpf6" Mar 10 00:38:32 crc kubenswrapper[4906]: I0310 00:38:32.085857 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hrpf6"] Mar 10 00:38:32 crc kubenswrapper[4906]: I0310 00:38:32.138224 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qzds\" (UniqueName: \"kubernetes.io/projected/a11dfde4-da18-49df-80a2-0225adb53b76-kube-api-access-8qzds\") pod \"certified-operators-hrpf6\" (UID: \"a11dfde4-da18-49df-80a2-0225adb53b76\") " pod="openshift-marketplace/certified-operators-hrpf6" Mar 10 00:38:32 crc kubenswrapper[4906]: I0310 00:38:32.138414 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a11dfde4-da18-49df-80a2-0225adb53b76-utilities\") pod \"certified-operators-hrpf6\" (UID: \"a11dfde4-da18-49df-80a2-0225adb53b76\") " pod="openshift-marketplace/certified-operators-hrpf6" Mar 10 00:38:32 crc kubenswrapper[4906]: I0310 00:38:32.138480 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/a11dfde4-da18-49df-80a2-0225adb53b76-catalog-content\") pod \"certified-operators-hrpf6\" (UID: \"a11dfde4-da18-49df-80a2-0225adb53b76\") " pod="openshift-marketplace/certified-operators-hrpf6" Mar 10 00:38:32 crc kubenswrapper[4906]: I0310 00:38:32.239769 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a11dfde4-da18-49df-80a2-0225adb53b76-utilities\") pod \"certified-operators-hrpf6\" (UID: \"a11dfde4-da18-49df-80a2-0225adb53b76\") " pod="openshift-marketplace/certified-operators-hrpf6" Mar 10 00:38:32 crc kubenswrapper[4906]: I0310 00:38:32.239868 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a11dfde4-da18-49df-80a2-0225adb53b76-catalog-content\") pod \"certified-operators-hrpf6\" (UID: \"a11dfde4-da18-49df-80a2-0225adb53b76\") " pod="openshift-marketplace/certified-operators-hrpf6" Mar 10 00:38:32 crc kubenswrapper[4906]: I0310 00:38:32.239935 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qzds\" (UniqueName: \"kubernetes.io/projected/a11dfde4-da18-49df-80a2-0225adb53b76-kube-api-access-8qzds\") pod \"certified-operators-hrpf6\" (UID: \"a11dfde4-da18-49df-80a2-0225adb53b76\") " pod="openshift-marketplace/certified-operators-hrpf6" Mar 10 00:38:32 crc kubenswrapper[4906]: I0310 00:38:32.241602 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a11dfde4-da18-49df-80a2-0225adb53b76-utilities\") pod \"certified-operators-hrpf6\" (UID: \"a11dfde4-da18-49df-80a2-0225adb53b76\") " pod="openshift-marketplace/certified-operators-hrpf6" Mar 10 00:38:32 crc kubenswrapper[4906]: I0310 00:38:32.242083 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/a11dfde4-da18-49df-80a2-0225adb53b76-catalog-content\") pod \"certified-operators-hrpf6\" (UID: \"a11dfde4-da18-49df-80a2-0225adb53b76\") " pod="openshift-marketplace/certified-operators-hrpf6" Mar 10 00:38:32 crc kubenswrapper[4906]: I0310 00:38:32.263758 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qzds\" (UniqueName: \"kubernetes.io/projected/a11dfde4-da18-49df-80a2-0225adb53b76-kube-api-access-8qzds\") pod \"certified-operators-hrpf6\" (UID: \"a11dfde4-da18-49df-80a2-0225adb53b76\") " pod="openshift-marketplace/certified-operators-hrpf6" Mar 10 00:38:32 crc kubenswrapper[4906]: I0310 00:38:32.408524 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hrpf6" Mar 10 00:38:32 crc kubenswrapper[4906]: I0310 00:38:32.692578 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hrpf6"] Mar 10 00:38:33 crc kubenswrapper[4906]: I0310 00:38:33.233252 4906 generic.go:334] "Generic (PLEG): container finished" podID="a11dfde4-da18-49df-80a2-0225adb53b76" containerID="3357b61f1049f5f070a45efb480025c78a6e69697076c48fa112b3d92c4f766d" exitCode=0 Mar 10 00:38:33 crc kubenswrapper[4906]: I0310 00:38:33.233359 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hrpf6" event={"ID":"a11dfde4-da18-49df-80a2-0225adb53b76","Type":"ContainerDied","Data":"3357b61f1049f5f070a45efb480025c78a6e69697076c48fa112b3d92c4f766d"} Mar 10 00:38:33 crc kubenswrapper[4906]: I0310 00:38:33.233564 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hrpf6" event={"ID":"a11dfde4-da18-49df-80a2-0225adb53b76","Type":"ContainerStarted","Data":"e2152f112234ec77982139d1745c2be87f297c9dd58e8d607c96da20c24b109e"} Mar 10 00:38:33 crc kubenswrapper[4906]: I0310 00:38:33.235061 4906 provider.go:102] Refreshing cache for 
provider: *credentialprovider.defaultDockerConfigProvider Mar 10 00:38:34 crc kubenswrapper[4906]: I0310 00:38:34.247441 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hrpf6" event={"ID":"a11dfde4-da18-49df-80a2-0225adb53b76","Type":"ContainerStarted","Data":"9981967edcbf3e568b413ab1956a957b472140c27130202043b34fed3dec0c20"} Mar 10 00:38:35 crc kubenswrapper[4906]: I0310 00:38:35.259187 4906 generic.go:334] "Generic (PLEG): container finished" podID="a11dfde4-da18-49df-80a2-0225adb53b76" containerID="9981967edcbf3e568b413ab1956a957b472140c27130202043b34fed3dec0c20" exitCode=0 Mar 10 00:38:35 crc kubenswrapper[4906]: I0310 00:38:35.259246 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hrpf6" event={"ID":"a11dfde4-da18-49df-80a2-0225adb53b76","Type":"ContainerDied","Data":"9981967edcbf3e568b413ab1956a957b472140c27130202043b34fed3dec0c20"} Mar 10 00:38:36 crc kubenswrapper[4906]: I0310 00:38:36.371347 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hrpf6" event={"ID":"a11dfde4-da18-49df-80a2-0225adb53b76","Type":"ContainerStarted","Data":"0c5b952c28ba84e53b938f5c456fad1d40cb4a44765d3d22209d061636e974c2"} Mar 10 00:38:36 crc kubenswrapper[4906]: I0310 00:38:36.395396 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hrpf6" podStartSLOduration=1.91993462 podStartE2EDuration="4.395376138s" podCreationTimestamp="2026-03-10 00:38:32 +0000 UTC" firstStartedPulling="2026-03-10 00:38:33.23474366 +0000 UTC m=+1939.382638792" lastFinishedPulling="2026-03-10 00:38:35.710185198 +0000 UTC m=+1941.858080310" observedRunningTime="2026-03-10 00:38:36.389361539 +0000 UTC m=+1942.537256651" watchObservedRunningTime="2026-03-10 00:38:36.395376138 +0000 UTC m=+1942.543271250" Mar 10 00:38:39 crc kubenswrapper[4906]: I0310 00:38:39.393893 4906 generic.go:334] 
"Generic (PLEG): container finished" podID="bc2480c1-8758-4811-b5a0-f4dcc7884a39" containerID="02e13ca2234939a4e9cb4b8da2eaada285ca342f200ef8919c89f0f809831a20" exitCode=0 Mar 10 00:38:39 crc kubenswrapper[4906]: I0310 00:38:39.393991 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ffwsk/must-gather-sgtxk" event={"ID":"bc2480c1-8758-4811-b5a0-f4dcc7884a39","Type":"ContainerDied","Data":"02e13ca2234939a4e9cb4b8da2eaada285ca342f200ef8919c89f0f809831a20"} Mar 10 00:38:39 crc kubenswrapper[4906]: I0310 00:38:39.394755 4906 scope.go:117] "RemoveContainer" containerID="02e13ca2234939a4e9cb4b8da2eaada285ca342f200ef8919c89f0f809831a20" Mar 10 00:38:39 crc kubenswrapper[4906]: I0310 00:38:39.717435 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ffwsk_must-gather-sgtxk_bc2480c1-8758-4811-b5a0-f4dcc7884a39/gather/0.log" Mar 10 00:38:42 crc kubenswrapper[4906]: I0310 00:38:42.409521 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hrpf6" Mar 10 00:38:42 crc kubenswrapper[4906]: I0310 00:38:42.409824 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hrpf6" Mar 10 00:38:42 crc kubenswrapper[4906]: I0310 00:38:42.446950 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hrpf6" Mar 10 00:38:43 crc kubenswrapper[4906]: I0310 00:38:43.477100 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hrpf6" Mar 10 00:38:43 crc kubenswrapper[4906]: I0310 00:38:43.527869 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hrpf6"] Mar 10 00:38:45 crc kubenswrapper[4906]: I0310 00:38:45.437420 4906 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/certified-operators-hrpf6" podUID="a11dfde4-da18-49df-80a2-0225adb53b76" containerName="registry-server" containerID="cri-o://0c5b952c28ba84e53b938f5c456fad1d40cb4a44765d3d22209d061636e974c2" gracePeriod=2 Mar 10 00:38:45 crc kubenswrapper[4906]: I0310 00:38:45.827578 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hrpf6" Mar 10 00:38:45 crc kubenswrapper[4906]: I0310 00:38:45.931004 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a11dfde4-da18-49df-80a2-0225adb53b76-catalog-content\") pod \"a11dfde4-da18-49df-80a2-0225adb53b76\" (UID: \"a11dfde4-da18-49df-80a2-0225adb53b76\") " Mar 10 00:38:45 crc kubenswrapper[4906]: I0310 00:38:45.931325 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a11dfde4-da18-49df-80a2-0225adb53b76-utilities\") pod \"a11dfde4-da18-49df-80a2-0225adb53b76\" (UID: \"a11dfde4-da18-49df-80a2-0225adb53b76\") " Mar 10 00:38:45 crc kubenswrapper[4906]: I0310 00:38:45.931370 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qzds\" (UniqueName: \"kubernetes.io/projected/a11dfde4-da18-49df-80a2-0225adb53b76-kube-api-access-8qzds\") pod \"a11dfde4-da18-49df-80a2-0225adb53b76\" (UID: \"a11dfde4-da18-49df-80a2-0225adb53b76\") " Mar 10 00:38:45 crc kubenswrapper[4906]: I0310 00:38:45.932584 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a11dfde4-da18-49df-80a2-0225adb53b76-utilities" (OuterVolumeSpecName: "utilities") pod "a11dfde4-da18-49df-80a2-0225adb53b76" (UID: "a11dfde4-da18-49df-80a2-0225adb53b76"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:38:45 crc kubenswrapper[4906]: I0310 00:38:45.936866 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a11dfde4-da18-49df-80a2-0225adb53b76-kube-api-access-8qzds" (OuterVolumeSpecName: "kube-api-access-8qzds") pod "a11dfde4-da18-49df-80a2-0225adb53b76" (UID: "a11dfde4-da18-49df-80a2-0225adb53b76"). InnerVolumeSpecName "kube-api-access-8qzds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:38:45 crc kubenswrapper[4906]: I0310 00:38:45.986362 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a11dfde4-da18-49df-80a2-0225adb53b76-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a11dfde4-da18-49df-80a2-0225adb53b76" (UID: "a11dfde4-da18-49df-80a2-0225adb53b76"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:38:46 crc kubenswrapper[4906]: I0310 00:38:46.033913 4906 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a11dfde4-da18-49df-80a2-0225adb53b76-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 00:38:46 crc kubenswrapper[4906]: I0310 00:38:46.033963 4906 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a11dfde4-da18-49df-80a2-0225adb53b76-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 00:38:46 crc kubenswrapper[4906]: I0310 00:38:46.033989 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qzds\" (UniqueName: \"kubernetes.io/projected/a11dfde4-da18-49df-80a2-0225adb53b76-kube-api-access-8qzds\") on node \"crc\" DevicePath \"\"" Mar 10 00:38:46 crc kubenswrapper[4906]: I0310 00:38:46.445509 4906 generic.go:334] "Generic (PLEG): container finished" podID="a11dfde4-da18-49df-80a2-0225adb53b76" 
containerID="0c5b952c28ba84e53b938f5c456fad1d40cb4a44765d3d22209d061636e974c2" exitCode=0 Mar 10 00:38:46 crc kubenswrapper[4906]: I0310 00:38:46.445547 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hrpf6" event={"ID":"a11dfde4-da18-49df-80a2-0225adb53b76","Type":"ContainerDied","Data":"0c5b952c28ba84e53b938f5c456fad1d40cb4a44765d3d22209d061636e974c2"} Mar 10 00:38:46 crc kubenswrapper[4906]: I0310 00:38:46.445570 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hrpf6" event={"ID":"a11dfde4-da18-49df-80a2-0225adb53b76","Type":"ContainerDied","Data":"e2152f112234ec77982139d1745c2be87f297c9dd58e8d607c96da20c24b109e"} Mar 10 00:38:46 crc kubenswrapper[4906]: I0310 00:38:46.445587 4906 scope.go:117] "RemoveContainer" containerID="0c5b952c28ba84e53b938f5c456fad1d40cb4a44765d3d22209d061636e974c2" Mar 10 00:38:46 crc kubenswrapper[4906]: I0310 00:38:46.445693 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hrpf6" Mar 10 00:38:46 crc kubenswrapper[4906]: I0310 00:38:46.471746 4906 scope.go:117] "RemoveContainer" containerID="9981967edcbf3e568b413ab1956a957b472140c27130202043b34fed3dec0c20" Mar 10 00:38:46 crc kubenswrapper[4906]: I0310 00:38:46.479533 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hrpf6"] Mar 10 00:38:46 crc kubenswrapper[4906]: I0310 00:38:46.488857 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hrpf6"] Mar 10 00:38:46 crc kubenswrapper[4906]: I0310 00:38:46.500716 4906 scope.go:117] "RemoveContainer" containerID="3357b61f1049f5f070a45efb480025c78a6e69697076c48fa112b3d92c4f766d" Mar 10 00:38:46 crc kubenswrapper[4906]: I0310 00:38:46.521080 4906 scope.go:117] "RemoveContainer" containerID="0c5b952c28ba84e53b938f5c456fad1d40cb4a44765d3d22209d061636e974c2" Mar 10 00:38:46 crc kubenswrapper[4906]: E0310 00:38:46.521530 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c5b952c28ba84e53b938f5c456fad1d40cb4a44765d3d22209d061636e974c2\": container with ID starting with 0c5b952c28ba84e53b938f5c456fad1d40cb4a44765d3d22209d061636e974c2 not found: ID does not exist" containerID="0c5b952c28ba84e53b938f5c456fad1d40cb4a44765d3d22209d061636e974c2" Mar 10 00:38:46 crc kubenswrapper[4906]: I0310 00:38:46.521564 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c5b952c28ba84e53b938f5c456fad1d40cb4a44765d3d22209d061636e974c2"} err="failed to get container status \"0c5b952c28ba84e53b938f5c456fad1d40cb4a44765d3d22209d061636e974c2\": rpc error: code = NotFound desc = could not find container \"0c5b952c28ba84e53b938f5c456fad1d40cb4a44765d3d22209d061636e974c2\": container with ID starting with 0c5b952c28ba84e53b938f5c456fad1d40cb4a44765d3d22209d061636e974c2 not 
found: ID does not exist" Mar 10 00:38:46 crc kubenswrapper[4906]: I0310 00:38:46.521583 4906 scope.go:117] "RemoveContainer" containerID="9981967edcbf3e568b413ab1956a957b472140c27130202043b34fed3dec0c20" Mar 10 00:38:46 crc kubenswrapper[4906]: E0310 00:38:46.521924 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9981967edcbf3e568b413ab1956a957b472140c27130202043b34fed3dec0c20\": container with ID starting with 9981967edcbf3e568b413ab1956a957b472140c27130202043b34fed3dec0c20 not found: ID does not exist" containerID="9981967edcbf3e568b413ab1956a957b472140c27130202043b34fed3dec0c20" Mar 10 00:38:46 crc kubenswrapper[4906]: I0310 00:38:46.521984 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9981967edcbf3e568b413ab1956a957b472140c27130202043b34fed3dec0c20"} err="failed to get container status \"9981967edcbf3e568b413ab1956a957b472140c27130202043b34fed3dec0c20\": rpc error: code = NotFound desc = could not find container \"9981967edcbf3e568b413ab1956a957b472140c27130202043b34fed3dec0c20\": container with ID starting with 9981967edcbf3e568b413ab1956a957b472140c27130202043b34fed3dec0c20 not found: ID does not exist" Mar 10 00:38:46 crc kubenswrapper[4906]: I0310 00:38:46.522024 4906 scope.go:117] "RemoveContainer" containerID="3357b61f1049f5f070a45efb480025c78a6e69697076c48fa112b3d92c4f766d" Mar 10 00:38:46 crc kubenswrapper[4906]: E0310 00:38:46.522329 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3357b61f1049f5f070a45efb480025c78a6e69697076c48fa112b3d92c4f766d\": container with ID starting with 3357b61f1049f5f070a45efb480025c78a6e69697076c48fa112b3d92c4f766d not found: ID does not exist" containerID="3357b61f1049f5f070a45efb480025c78a6e69697076c48fa112b3d92c4f766d" Mar 10 00:38:46 crc kubenswrapper[4906]: I0310 00:38:46.522357 4906 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3357b61f1049f5f070a45efb480025c78a6e69697076c48fa112b3d92c4f766d"} err="failed to get container status \"3357b61f1049f5f070a45efb480025c78a6e69697076c48fa112b3d92c4f766d\": rpc error: code = NotFound desc = could not find container \"3357b61f1049f5f070a45efb480025c78a6e69697076c48fa112b3d92c4f766d\": container with ID starting with 3357b61f1049f5f070a45efb480025c78a6e69697076c48fa112b3d92c4f766d not found: ID does not exist" Mar 10 00:38:46 crc kubenswrapper[4906]: I0310 00:38:46.587579 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a11dfde4-da18-49df-80a2-0225adb53b76" path="/var/lib/kubelet/pods/a11dfde4-da18-49df-80a2-0225adb53b76/volumes" Mar 10 00:38:46 crc kubenswrapper[4906]: I0310 00:38:46.935917 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-ffwsk/must-gather-sgtxk"] Mar 10 00:38:46 crc kubenswrapper[4906]: I0310 00:38:46.936510 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-ffwsk/must-gather-sgtxk" podUID="bc2480c1-8758-4811-b5a0-f4dcc7884a39" containerName="copy" containerID="cri-o://99d300d896686940c8a67e9871ac0e52cd15e685a8375458705fdb5290204c8b" gracePeriod=2 Mar 10 00:38:46 crc kubenswrapper[4906]: I0310 00:38:46.941491 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-ffwsk/must-gather-sgtxk"] Mar 10 00:38:47 crc kubenswrapper[4906]: I0310 00:38:47.407176 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ffwsk_must-gather-sgtxk_bc2480c1-8758-4811-b5a0-f4dcc7884a39/copy/0.log" Mar 10 00:38:47 crc kubenswrapper[4906]: I0310 00:38:47.407947 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ffwsk/must-gather-sgtxk" Mar 10 00:38:47 crc kubenswrapper[4906]: I0310 00:38:47.453896 4906 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ffwsk_must-gather-sgtxk_bc2480c1-8758-4811-b5a0-f4dcc7884a39/copy/0.log" Mar 10 00:38:47 crc kubenswrapper[4906]: I0310 00:38:47.454256 4906 generic.go:334] "Generic (PLEG): container finished" podID="bc2480c1-8758-4811-b5a0-f4dcc7884a39" containerID="99d300d896686940c8a67e9871ac0e52cd15e685a8375458705fdb5290204c8b" exitCode=143 Mar 10 00:38:47 crc kubenswrapper[4906]: I0310 00:38:47.454299 4906 scope.go:117] "RemoveContainer" containerID="99d300d896686940c8a67e9871ac0e52cd15e685a8375458705fdb5290204c8b" Mar 10 00:38:47 crc kubenswrapper[4906]: I0310 00:38:47.454310 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ffwsk/must-gather-sgtxk" Mar 10 00:38:47 crc kubenswrapper[4906]: I0310 00:38:47.470831 4906 scope.go:117] "RemoveContainer" containerID="02e13ca2234939a4e9cb4b8da2eaada285ca342f200ef8919c89f0f809831a20" Mar 10 00:38:47 crc kubenswrapper[4906]: I0310 00:38:47.503569 4906 scope.go:117] "RemoveContainer" containerID="99d300d896686940c8a67e9871ac0e52cd15e685a8375458705fdb5290204c8b" Mar 10 00:38:47 crc kubenswrapper[4906]: E0310 00:38:47.504553 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99d300d896686940c8a67e9871ac0e52cd15e685a8375458705fdb5290204c8b\": container with ID starting with 99d300d896686940c8a67e9871ac0e52cd15e685a8375458705fdb5290204c8b not found: ID does not exist" containerID="99d300d896686940c8a67e9871ac0e52cd15e685a8375458705fdb5290204c8b" Mar 10 00:38:47 crc kubenswrapper[4906]: I0310 00:38:47.504591 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99d300d896686940c8a67e9871ac0e52cd15e685a8375458705fdb5290204c8b"} err="failed 
to get container status \"99d300d896686940c8a67e9871ac0e52cd15e685a8375458705fdb5290204c8b\": rpc error: code = NotFound desc = could not find container \"99d300d896686940c8a67e9871ac0e52cd15e685a8375458705fdb5290204c8b\": container with ID starting with 99d300d896686940c8a67e9871ac0e52cd15e685a8375458705fdb5290204c8b not found: ID does not exist" Mar 10 00:38:47 crc kubenswrapper[4906]: I0310 00:38:47.504647 4906 scope.go:117] "RemoveContainer" containerID="02e13ca2234939a4e9cb4b8da2eaada285ca342f200ef8919c89f0f809831a20" Mar 10 00:38:47 crc kubenswrapper[4906]: E0310 00:38:47.504979 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02e13ca2234939a4e9cb4b8da2eaada285ca342f200ef8919c89f0f809831a20\": container with ID starting with 02e13ca2234939a4e9cb4b8da2eaada285ca342f200ef8919c89f0f809831a20 not found: ID does not exist" containerID="02e13ca2234939a4e9cb4b8da2eaada285ca342f200ef8919c89f0f809831a20" Mar 10 00:38:47 crc kubenswrapper[4906]: I0310 00:38:47.505039 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02e13ca2234939a4e9cb4b8da2eaada285ca342f200ef8919c89f0f809831a20"} err="failed to get container status \"02e13ca2234939a4e9cb4b8da2eaada285ca342f200ef8919c89f0f809831a20\": rpc error: code = NotFound desc = could not find container \"02e13ca2234939a4e9cb4b8da2eaada285ca342f200ef8919c89f0f809831a20\": container with ID starting with 02e13ca2234939a4e9cb4b8da2eaada285ca342f200ef8919c89f0f809831a20 not found: ID does not exist" Mar 10 00:38:47 crc kubenswrapper[4906]: I0310 00:38:47.554109 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/bc2480c1-8758-4811-b5a0-f4dcc7884a39-must-gather-output\") pod \"bc2480c1-8758-4811-b5a0-f4dcc7884a39\" (UID: \"bc2480c1-8758-4811-b5a0-f4dcc7884a39\") " Mar 10 00:38:47 crc kubenswrapper[4906]: I0310 
00:38:47.554211 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8n9c\" (UniqueName: \"kubernetes.io/projected/bc2480c1-8758-4811-b5a0-f4dcc7884a39-kube-api-access-v8n9c\") pod \"bc2480c1-8758-4811-b5a0-f4dcc7884a39\" (UID: \"bc2480c1-8758-4811-b5a0-f4dcc7884a39\") " Mar 10 00:38:47 crc kubenswrapper[4906]: I0310 00:38:47.561081 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc2480c1-8758-4811-b5a0-f4dcc7884a39-kube-api-access-v8n9c" (OuterVolumeSpecName: "kube-api-access-v8n9c") pod "bc2480c1-8758-4811-b5a0-f4dcc7884a39" (UID: "bc2480c1-8758-4811-b5a0-f4dcc7884a39"). InnerVolumeSpecName "kube-api-access-v8n9c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:38:47 crc kubenswrapper[4906]: I0310 00:38:47.622237 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc2480c1-8758-4811-b5a0-f4dcc7884a39-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "bc2480c1-8758-4811-b5a0-f4dcc7884a39" (UID: "bc2480c1-8758-4811-b5a0-f4dcc7884a39"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:38:47 crc kubenswrapper[4906]: I0310 00:38:47.656237 4906 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/bc2480c1-8758-4811-b5a0-f4dcc7884a39-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 10 00:38:47 crc kubenswrapper[4906]: I0310 00:38:47.656267 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8n9c\" (UniqueName: \"kubernetes.io/projected/bc2480c1-8758-4811-b5a0-f4dcc7884a39-kube-api-access-v8n9c\") on node \"crc\" DevicePath \"\"" Mar 10 00:38:48 crc kubenswrapper[4906]: I0310 00:38:48.595163 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc2480c1-8758-4811-b5a0-f4dcc7884a39" path="/var/lib/kubelet/pods/bc2480c1-8758-4811-b5a0-f4dcc7884a39/volumes" Mar 10 00:39:30 crc kubenswrapper[4906]: I0310 00:39:30.502160 4906 patch_prober.go:28] interesting pod/machine-config-daemon-bxtw4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:39:30 crc kubenswrapper[4906]: I0310 00:39:30.502814 4906 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:40:00 crc kubenswrapper[4906]: I0310 00:40:00.149403 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551720-fp84x"] Mar 10 00:40:00 crc kubenswrapper[4906]: E0310 00:40:00.150472 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc2480c1-8758-4811-b5a0-f4dcc7884a39" containerName="copy" Mar 10 00:40:00 crc 
kubenswrapper[4906]: I0310 00:40:00.150497 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc2480c1-8758-4811-b5a0-f4dcc7884a39" containerName="copy" Mar 10 00:40:00 crc kubenswrapper[4906]: E0310 00:40:00.150541 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc2480c1-8758-4811-b5a0-f4dcc7884a39" containerName="gather" Mar 10 00:40:00 crc kubenswrapper[4906]: I0310 00:40:00.150554 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc2480c1-8758-4811-b5a0-f4dcc7884a39" containerName="gather" Mar 10 00:40:00 crc kubenswrapper[4906]: E0310 00:40:00.150578 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a11dfde4-da18-49df-80a2-0225adb53b76" containerName="extract-utilities" Mar 10 00:40:00 crc kubenswrapper[4906]: I0310 00:40:00.150594 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="a11dfde4-da18-49df-80a2-0225adb53b76" containerName="extract-utilities" Mar 10 00:40:00 crc kubenswrapper[4906]: E0310 00:40:00.150619 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a11dfde4-da18-49df-80a2-0225adb53b76" containerName="registry-server" Mar 10 00:40:00 crc kubenswrapper[4906]: I0310 00:40:00.150632 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="a11dfde4-da18-49df-80a2-0225adb53b76" containerName="registry-server" Mar 10 00:40:00 crc kubenswrapper[4906]: E0310 00:40:00.150704 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a11dfde4-da18-49df-80a2-0225adb53b76" containerName="extract-content" Mar 10 00:40:00 crc kubenswrapper[4906]: I0310 00:40:00.150718 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="a11dfde4-da18-49df-80a2-0225adb53b76" containerName="extract-content" Mar 10 00:40:00 crc kubenswrapper[4906]: I0310 00:40:00.150926 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc2480c1-8758-4811-b5a0-f4dcc7884a39" containerName="gather" Mar 10 00:40:00 crc kubenswrapper[4906]: I0310 00:40:00.150947 4906 
memory_manager.go:354] "RemoveStaleState removing state" podUID="a11dfde4-da18-49df-80a2-0225adb53b76" containerName="registry-server" Mar 10 00:40:00 crc kubenswrapper[4906]: I0310 00:40:00.150983 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc2480c1-8758-4811-b5a0-f4dcc7884a39" containerName="copy" Mar 10 00:40:00 crc kubenswrapper[4906]: I0310 00:40:00.151756 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551720-fp84x" Mar 10 00:40:00 crc kubenswrapper[4906]: I0310 00:40:00.154578 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 00:40:00 crc kubenswrapper[4906]: I0310 00:40:00.156502 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 00:40:00 crc kubenswrapper[4906]: I0310 00:40:00.156533 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ktdr6" Mar 10 00:40:00 crc kubenswrapper[4906]: I0310 00:40:00.161289 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551720-fp84x"] Mar 10 00:40:00 crc kubenswrapper[4906]: I0310 00:40:00.205986 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2gpj\" (UniqueName: \"kubernetes.io/projected/ad8ab589-e0dd-4b06-9503-242b82610ec4-kube-api-access-t2gpj\") pod \"auto-csr-approver-29551720-fp84x\" (UID: \"ad8ab589-e0dd-4b06-9503-242b82610ec4\") " pod="openshift-infra/auto-csr-approver-29551720-fp84x" Mar 10 00:40:00 crc kubenswrapper[4906]: I0310 00:40:00.312781 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2gpj\" (UniqueName: \"kubernetes.io/projected/ad8ab589-e0dd-4b06-9503-242b82610ec4-kube-api-access-t2gpj\") pod \"auto-csr-approver-29551720-fp84x\" (UID: 
\"ad8ab589-e0dd-4b06-9503-242b82610ec4\") " pod="openshift-infra/auto-csr-approver-29551720-fp84x" Mar 10 00:40:00 crc kubenswrapper[4906]: I0310 00:40:00.346016 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2gpj\" (UniqueName: \"kubernetes.io/projected/ad8ab589-e0dd-4b06-9503-242b82610ec4-kube-api-access-t2gpj\") pod \"auto-csr-approver-29551720-fp84x\" (UID: \"ad8ab589-e0dd-4b06-9503-242b82610ec4\") " pod="openshift-infra/auto-csr-approver-29551720-fp84x" Mar 10 00:40:00 crc kubenswrapper[4906]: I0310 00:40:00.488890 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551720-fp84x" Mar 10 00:40:00 crc kubenswrapper[4906]: I0310 00:40:00.502131 4906 patch_prober.go:28] interesting pod/machine-config-daemon-bxtw4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:40:00 crc kubenswrapper[4906]: I0310 00:40:00.502172 4906 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:40:00 crc kubenswrapper[4906]: I0310 00:40:00.772620 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551720-fp84x"] Mar 10 00:40:01 crc kubenswrapper[4906]: I0310 00:40:01.242625 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551720-fp84x" event={"ID":"ad8ab589-e0dd-4b06-9503-242b82610ec4","Type":"ContainerStarted","Data":"caa8b30208c053a1ca387d5e9f552ef6e085c56da16db242a9c311d3ca4b6f4b"} Mar 10 00:40:03 crc kubenswrapper[4906]: I0310 
00:40:03.263570 4906 generic.go:334] "Generic (PLEG): container finished" podID="ad8ab589-e0dd-4b06-9503-242b82610ec4" containerID="ba1070d48404850cf895253e1c727d7b947ffbd3e6d17e94959d70cb01669d8e" exitCode=0 Mar 10 00:40:03 crc kubenswrapper[4906]: I0310 00:40:03.263666 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551720-fp84x" event={"ID":"ad8ab589-e0dd-4b06-9503-242b82610ec4","Type":"ContainerDied","Data":"ba1070d48404850cf895253e1c727d7b947ffbd3e6d17e94959d70cb01669d8e"} Mar 10 00:40:04 crc kubenswrapper[4906]: I0310 00:40:04.548593 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551720-fp84x" Mar 10 00:40:04 crc kubenswrapper[4906]: I0310 00:40:04.590734 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2gpj\" (UniqueName: \"kubernetes.io/projected/ad8ab589-e0dd-4b06-9503-242b82610ec4-kube-api-access-t2gpj\") pod \"ad8ab589-e0dd-4b06-9503-242b82610ec4\" (UID: \"ad8ab589-e0dd-4b06-9503-242b82610ec4\") " Mar 10 00:40:04 crc kubenswrapper[4906]: I0310 00:40:04.598103 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad8ab589-e0dd-4b06-9503-242b82610ec4-kube-api-access-t2gpj" (OuterVolumeSpecName: "kube-api-access-t2gpj") pod "ad8ab589-e0dd-4b06-9503-242b82610ec4" (UID: "ad8ab589-e0dd-4b06-9503-242b82610ec4"). InnerVolumeSpecName "kube-api-access-t2gpj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:40:04 crc kubenswrapper[4906]: I0310 00:40:04.693516 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2gpj\" (UniqueName: \"kubernetes.io/projected/ad8ab589-e0dd-4b06-9503-242b82610ec4-kube-api-access-t2gpj\") on node \"crc\" DevicePath \"\"" Mar 10 00:40:05 crc kubenswrapper[4906]: I0310 00:40:05.282298 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551720-fp84x" event={"ID":"ad8ab589-e0dd-4b06-9503-242b82610ec4","Type":"ContainerDied","Data":"caa8b30208c053a1ca387d5e9f552ef6e085c56da16db242a9c311d3ca4b6f4b"} Mar 10 00:40:05 crc kubenswrapper[4906]: I0310 00:40:05.282369 4906 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="caa8b30208c053a1ca387d5e9f552ef6e085c56da16db242a9c311d3ca4b6f4b" Mar 10 00:40:05 crc kubenswrapper[4906]: I0310 00:40:05.282390 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551720-fp84x" Mar 10 00:40:05 crc kubenswrapper[4906]: I0310 00:40:05.639665 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551714-l284f"] Mar 10 00:40:05 crc kubenswrapper[4906]: I0310 00:40:05.649971 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551714-l284f"] Mar 10 00:40:06 crc kubenswrapper[4906]: I0310 00:40:06.598308 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce0b1165-845d-47b5-b6f8-3d2519066da0" path="/var/lib/kubelet/pods/ce0b1165-845d-47b5-b6f8-3d2519066da0/volumes" Mar 10 00:40:29 crc kubenswrapper[4906]: I0310 00:40:29.330568 4906 scope.go:117] "RemoveContainer" containerID="2e98eb0d7fb03e6c6ca674cc935091d17b62749141537c4a23c135853bc606d8" Mar 10 00:40:30 crc kubenswrapper[4906]: I0310 00:40:30.501892 4906 patch_prober.go:28] interesting pod/machine-config-daemon-bxtw4 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 00:40:30 crc kubenswrapper[4906]: I0310 00:40:30.502226 4906 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 00:40:30 crc kubenswrapper[4906]: I0310 00:40:30.502280 4906 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" Mar 10 00:40:30 crc kubenswrapper[4906]: I0310 00:40:30.503199 4906 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"beb22ffdab30b1e8caaf3c4130c31ace21314768233d626cae4396452b5de532"} pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 00:40:30 crc kubenswrapper[4906]: I0310 00:40:30.503321 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" podUID="72d61d35-0a64-45a5-8df3-9c429727deba" containerName="machine-config-daemon" containerID="cri-o://beb22ffdab30b1e8caaf3c4130c31ace21314768233d626cae4396452b5de532" gracePeriod=600 Mar 10 00:40:31 crc kubenswrapper[4906]: I0310 00:40:31.554733 4906 generic.go:334] "Generic (PLEG): container finished" podID="72d61d35-0a64-45a5-8df3-9c429727deba" containerID="beb22ffdab30b1e8caaf3c4130c31ace21314768233d626cae4396452b5de532" exitCode=0 Mar 10 00:40:31 crc kubenswrapper[4906]: I0310 00:40:31.554842 4906 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" event={"ID":"72d61d35-0a64-45a5-8df3-9c429727deba","Type":"ContainerDied","Data":"beb22ffdab30b1e8caaf3c4130c31ace21314768233d626cae4396452b5de532"} Mar 10 00:40:31 crc kubenswrapper[4906]: I0310 00:40:31.555044 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bxtw4" event={"ID":"72d61d35-0a64-45a5-8df3-9c429727deba","Type":"ContainerStarted","Data":"0d033e7ac7c9260d7ccca9f2edfe7d76ec55099d194329f7a80b0d29f3c00528"} Mar 10 00:40:31 crc kubenswrapper[4906]: I0310 00:40:31.555070 4906 scope.go:117] "RemoveContainer" containerID="3d03901045dd3898d4d4472388a38a1a6380b91b22032509551213e74b4671a3" Mar 10 00:40:53 crc kubenswrapper[4906]: I0310 00:40:53.383736 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-sxr8g"] Mar 10 00:40:53 crc kubenswrapper[4906]: E0310 00:40:53.387155 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad8ab589-e0dd-4b06-9503-242b82610ec4" containerName="oc" Mar 10 00:40:53 crc kubenswrapper[4906]: I0310 00:40:53.387198 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad8ab589-e0dd-4b06-9503-242b82610ec4" containerName="oc" Mar 10 00:40:53 crc kubenswrapper[4906]: I0310 00:40:53.387410 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad8ab589-e0dd-4b06-9503-242b82610ec4" containerName="oc" Mar 10 00:40:53 crc kubenswrapper[4906]: I0310 00:40:53.388946 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sxr8g" Mar 10 00:40:53 crc kubenswrapper[4906]: I0310 00:40:53.394491 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sxr8g"] Mar 10 00:40:53 crc kubenswrapper[4906]: I0310 00:40:53.497553 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e03d7aa-9094-4f89-85af-4ebbbf50e388-catalog-content\") pod \"redhat-operators-sxr8g\" (UID: \"3e03d7aa-9094-4f89-85af-4ebbbf50e388\") " pod="openshift-marketplace/redhat-operators-sxr8g" Mar 10 00:40:53 crc kubenswrapper[4906]: I0310 00:40:53.497658 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-485jj\" (UniqueName: \"kubernetes.io/projected/3e03d7aa-9094-4f89-85af-4ebbbf50e388-kube-api-access-485jj\") pod \"redhat-operators-sxr8g\" (UID: \"3e03d7aa-9094-4f89-85af-4ebbbf50e388\") " pod="openshift-marketplace/redhat-operators-sxr8g" Mar 10 00:40:53 crc kubenswrapper[4906]: I0310 00:40:53.497893 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e03d7aa-9094-4f89-85af-4ebbbf50e388-utilities\") pod \"redhat-operators-sxr8g\" (UID: \"3e03d7aa-9094-4f89-85af-4ebbbf50e388\") " pod="openshift-marketplace/redhat-operators-sxr8g" Mar 10 00:40:53 crc kubenswrapper[4906]: I0310 00:40:53.598977 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-485jj\" (UniqueName: \"kubernetes.io/projected/3e03d7aa-9094-4f89-85af-4ebbbf50e388-kube-api-access-485jj\") pod \"redhat-operators-sxr8g\" (UID: \"3e03d7aa-9094-4f89-85af-4ebbbf50e388\") " pod="openshift-marketplace/redhat-operators-sxr8g" Mar 10 00:40:53 crc kubenswrapper[4906]: I0310 00:40:53.599057 4906 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e03d7aa-9094-4f89-85af-4ebbbf50e388-utilities\") pod \"redhat-operators-sxr8g\" (UID: \"3e03d7aa-9094-4f89-85af-4ebbbf50e388\") " pod="openshift-marketplace/redhat-operators-sxr8g" Mar 10 00:40:53 crc kubenswrapper[4906]: I0310 00:40:53.599093 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e03d7aa-9094-4f89-85af-4ebbbf50e388-catalog-content\") pod \"redhat-operators-sxr8g\" (UID: \"3e03d7aa-9094-4f89-85af-4ebbbf50e388\") " pod="openshift-marketplace/redhat-operators-sxr8g" Mar 10 00:40:53 crc kubenswrapper[4906]: I0310 00:40:53.599661 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e03d7aa-9094-4f89-85af-4ebbbf50e388-catalog-content\") pod \"redhat-operators-sxr8g\" (UID: \"3e03d7aa-9094-4f89-85af-4ebbbf50e388\") " pod="openshift-marketplace/redhat-operators-sxr8g" Mar 10 00:40:53 crc kubenswrapper[4906]: I0310 00:40:53.599663 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e03d7aa-9094-4f89-85af-4ebbbf50e388-utilities\") pod \"redhat-operators-sxr8g\" (UID: \"3e03d7aa-9094-4f89-85af-4ebbbf50e388\") " pod="openshift-marketplace/redhat-operators-sxr8g" Mar 10 00:40:53 crc kubenswrapper[4906]: I0310 00:40:53.622987 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-485jj\" (UniqueName: \"kubernetes.io/projected/3e03d7aa-9094-4f89-85af-4ebbbf50e388-kube-api-access-485jj\") pod \"redhat-operators-sxr8g\" (UID: \"3e03d7aa-9094-4f89-85af-4ebbbf50e388\") " pod="openshift-marketplace/redhat-operators-sxr8g" Mar 10 00:40:53 crc kubenswrapper[4906]: I0310 00:40:53.709380 4906 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sxr8g" Mar 10 00:40:53 crc kubenswrapper[4906]: I0310 00:40:53.964740 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sxr8g"] Mar 10 00:40:54 crc kubenswrapper[4906]: I0310 00:40:54.794118 4906 generic.go:334] "Generic (PLEG): container finished" podID="3e03d7aa-9094-4f89-85af-4ebbbf50e388" containerID="599f46ecef340d27644df1fece3d2421dbaf06dc4b3226d551e3f641b4f8a212" exitCode=0 Mar 10 00:40:54 crc kubenswrapper[4906]: I0310 00:40:54.794160 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sxr8g" event={"ID":"3e03d7aa-9094-4f89-85af-4ebbbf50e388","Type":"ContainerDied","Data":"599f46ecef340d27644df1fece3d2421dbaf06dc4b3226d551e3f641b4f8a212"} Mar 10 00:40:54 crc kubenswrapper[4906]: I0310 00:40:54.794219 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sxr8g" event={"ID":"3e03d7aa-9094-4f89-85af-4ebbbf50e388","Type":"ContainerStarted","Data":"28ca1088ec0ad520f6092bc9dea9862afc4fdfa9247900202856071ffb75585a"} Mar 10 00:40:55 crc kubenswrapper[4906]: I0310 00:40:55.805506 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sxr8g" event={"ID":"3e03d7aa-9094-4f89-85af-4ebbbf50e388","Type":"ContainerStarted","Data":"776ddf10e0a9ea03e120908e3123920bf1b1ddddb2eab605437814cc00a40e2d"} Mar 10 00:40:56 crc kubenswrapper[4906]: I0310 00:40:56.814844 4906 generic.go:334] "Generic (PLEG): container finished" podID="3e03d7aa-9094-4f89-85af-4ebbbf50e388" containerID="776ddf10e0a9ea03e120908e3123920bf1b1ddddb2eab605437814cc00a40e2d" exitCode=0 Mar 10 00:40:56 crc kubenswrapper[4906]: I0310 00:40:56.814920 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sxr8g" 
event={"ID":"3e03d7aa-9094-4f89-85af-4ebbbf50e388","Type":"ContainerDied","Data":"776ddf10e0a9ea03e120908e3123920bf1b1ddddb2eab605437814cc00a40e2d"} Mar 10 00:40:57 crc kubenswrapper[4906]: I0310 00:40:57.823367 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sxr8g" event={"ID":"3e03d7aa-9094-4f89-85af-4ebbbf50e388","Type":"ContainerStarted","Data":"49aca979efa5cdc017deed9a6ba17b7e09a4bec4bd06b7870d6fa76497178c5e"} Mar 10 00:40:57 crc kubenswrapper[4906]: I0310 00:40:57.844388 4906 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sxr8g" podStartSLOduration=2.376269349 podStartE2EDuration="4.844366543s" podCreationTimestamp="2026-03-10 00:40:53 +0000 UTC" firstStartedPulling="2026-03-10 00:40:54.795643979 +0000 UTC m=+2080.943539091" lastFinishedPulling="2026-03-10 00:40:57.263741173 +0000 UTC m=+2083.411636285" observedRunningTime="2026-03-10 00:40:57.842049587 +0000 UTC m=+2083.989944699" watchObservedRunningTime="2026-03-10 00:40:57.844366543 +0000 UTC m=+2083.992261685" Mar 10 00:41:03 crc kubenswrapper[4906]: I0310 00:41:03.710226 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-sxr8g" Mar 10 00:41:03 crc kubenswrapper[4906]: I0310 00:41:03.710617 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sxr8g" Mar 10 00:41:04 crc kubenswrapper[4906]: I0310 00:41:04.023953 4906 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sxr8g" Mar 10 00:41:04 crc kubenswrapper[4906]: I0310 00:41:04.064379 4906 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-sxr8g" Mar 10 00:41:04 crc kubenswrapper[4906]: I0310 00:41:04.263160 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-sxr8g"] Mar 10 00:41:05 crc kubenswrapper[4906]: I0310 00:41:05.900506 4906 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-sxr8g" podUID="3e03d7aa-9094-4f89-85af-4ebbbf50e388" containerName="registry-server" containerID="cri-o://49aca979efa5cdc017deed9a6ba17b7e09a4bec4bd06b7870d6fa76497178c5e" gracePeriod=2 Mar 10 00:41:06 crc kubenswrapper[4906]: I0310 00:41:06.323835 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sxr8g" Mar 10 00:41:06 crc kubenswrapper[4906]: I0310 00:41:06.493717 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-485jj\" (UniqueName: \"kubernetes.io/projected/3e03d7aa-9094-4f89-85af-4ebbbf50e388-kube-api-access-485jj\") pod \"3e03d7aa-9094-4f89-85af-4ebbbf50e388\" (UID: \"3e03d7aa-9094-4f89-85af-4ebbbf50e388\") " Mar 10 00:41:06 crc kubenswrapper[4906]: I0310 00:41:06.493883 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e03d7aa-9094-4f89-85af-4ebbbf50e388-utilities\") pod \"3e03d7aa-9094-4f89-85af-4ebbbf50e388\" (UID: \"3e03d7aa-9094-4f89-85af-4ebbbf50e388\") " Mar 10 00:41:06 crc kubenswrapper[4906]: I0310 00:41:06.493948 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e03d7aa-9094-4f89-85af-4ebbbf50e388-catalog-content\") pod \"3e03d7aa-9094-4f89-85af-4ebbbf50e388\" (UID: \"3e03d7aa-9094-4f89-85af-4ebbbf50e388\") " Mar 10 00:41:06 crc kubenswrapper[4906]: I0310 00:41:06.496085 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e03d7aa-9094-4f89-85af-4ebbbf50e388-utilities" (OuterVolumeSpecName: "utilities") pod "3e03d7aa-9094-4f89-85af-4ebbbf50e388" (UID: 
"3e03d7aa-9094-4f89-85af-4ebbbf50e388"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 00:41:06 crc kubenswrapper[4906]: I0310 00:41:06.500106 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e03d7aa-9094-4f89-85af-4ebbbf50e388-kube-api-access-485jj" (OuterVolumeSpecName: "kube-api-access-485jj") pod "3e03d7aa-9094-4f89-85af-4ebbbf50e388" (UID: "3e03d7aa-9094-4f89-85af-4ebbbf50e388"). InnerVolumeSpecName "kube-api-access-485jj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 00:41:06 crc kubenswrapper[4906]: I0310 00:41:06.595585 4906 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e03d7aa-9094-4f89-85af-4ebbbf50e388-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 00:41:06 crc kubenswrapper[4906]: I0310 00:41:06.595626 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-485jj\" (UniqueName: \"kubernetes.io/projected/3e03d7aa-9094-4f89-85af-4ebbbf50e388-kube-api-access-485jj\") on node \"crc\" DevicePath \"\"" Mar 10 00:41:06 crc kubenswrapper[4906]: I0310 00:41:06.918997 4906 generic.go:334] "Generic (PLEG): container finished" podID="3e03d7aa-9094-4f89-85af-4ebbbf50e388" containerID="49aca979efa5cdc017deed9a6ba17b7e09a4bec4bd06b7870d6fa76497178c5e" exitCode=0 Mar 10 00:41:06 crc kubenswrapper[4906]: I0310 00:41:06.919046 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sxr8g" event={"ID":"3e03d7aa-9094-4f89-85af-4ebbbf50e388","Type":"ContainerDied","Data":"49aca979efa5cdc017deed9a6ba17b7e09a4bec4bd06b7870d6fa76497178c5e"} Mar 10 00:41:06 crc kubenswrapper[4906]: I0310 00:41:06.919057 4906 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sxr8g" Mar 10 00:41:06 crc kubenswrapper[4906]: I0310 00:41:06.919079 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sxr8g" event={"ID":"3e03d7aa-9094-4f89-85af-4ebbbf50e388","Type":"ContainerDied","Data":"28ca1088ec0ad520f6092bc9dea9862afc4fdfa9247900202856071ffb75585a"} Mar 10 00:41:06 crc kubenswrapper[4906]: I0310 00:41:06.919120 4906 scope.go:117] "RemoveContainer" containerID="49aca979efa5cdc017deed9a6ba17b7e09a4bec4bd06b7870d6fa76497178c5e" Mar 10 00:41:06 crc kubenswrapper[4906]: I0310 00:41:06.935596 4906 scope.go:117] "RemoveContainer" containerID="776ddf10e0a9ea03e120908e3123920bf1b1ddddb2eab605437814cc00a40e2d" Mar 10 00:41:06 crc kubenswrapper[4906]: I0310 00:41:06.953779 4906 scope.go:117] "RemoveContainer" containerID="599f46ecef340d27644df1fece3d2421dbaf06dc4b3226d551e3f641b4f8a212" Mar 10 00:41:06 crc kubenswrapper[4906]: I0310 00:41:06.982288 4906 scope.go:117] "RemoveContainer" containerID="49aca979efa5cdc017deed9a6ba17b7e09a4bec4bd06b7870d6fa76497178c5e" Mar 10 00:41:06 crc kubenswrapper[4906]: E0310 00:41:06.982868 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49aca979efa5cdc017deed9a6ba17b7e09a4bec4bd06b7870d6fa76497178c5e\": container with ID starting with 49aca979efa5cdc017deed9a6ba17b7e09a4bec4bd06b7870d6fa76497178c5e not found: ID does not exist" containerID="49aca979efa5cdc017deed9a6ba17b7e09a4bec4bd06b7870d6fa76497178c5e" Mar 10 00:41:06 crc kubenswrapper[4906]: I0310 00:41:06.982911 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49aca979efa5cdc017deed9a6ba17b7e09a4bec4bd06b7870d6fa76497178c5e"} err="failed to get container status \"49aca979efa5cdc017deed9a6ba17b7e09a4bec4bd06b7870d6fa76497178c5e\": rpc error: code = NotFound desc = could not find container 
\"49aca979efa5cdc017deed9a6ba17b7e09a4bec4bd06b7870d6fa76497178c5e\": container with ID starting with 49aca979efa5cdc017deed9a6ba17b7e09a4bec4bd06b7870d6fa76497178c5e not found: ID does not exist" Mar 10 00:41:06 crc kubenswrapper[4906]: I0310 00:41:06.982937 4906 scope.go:117] "RemoveContainer" containerID="776ddf10e0a9ea03e120908e3123920bf1b1ddddb2eab605437814cc00a40e2d" Mar 10 00:41:06 crc kubenswrapper[4906]: E0310 00:41:06.983498 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"776ddf10e0a9ea03e120908e3123920bf1b1ddddb2eab605437814cc00a40e2d\": container with ID starting with 776ddf10e0a9ea03e120908e3123920bf1b1ddddb2eab605437814cc00a40e2d not found: ID does not exist" containerID="776ddf10e0a9ea03e120908e3123920bf1b1ddddb2eab605437814cc00a40e2d" Mar 10 00:41:06 crc kubenswrapper[4906]: I0310 00:41:06.983553 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"776ddf10e0a9ea03e120908e3123920bf1b1ddddb2eab605437814cc00a40e2d"} err="failed to get container status \"776ddf10e0a9ea03e120908e3123920bf1b1ddddb2eab605437814cc00a40e2d\": rpc error: code = NotFound desc = could not find container \"776ddf10e0a9ea03e120908e3123920bf1b1ddddb2eab605437814cc00a40e2d\": container with ID starting with 776ddf10e0a9ea03e120908e3123920bf1b1ddddb2eab605437814cc00a40e2d not found: ID does not exist" Mar 10 00:41:06 crc kubenswrapper[4906]: I0310 00:41:06.983585 4906 scope.go:117] "RemoveContainer" containerID="599f46ecef340d27644df1fece3d2421dbaf06dc4b3226d551e3f641b4f8a212" Mar 10 00:41:06 crc kubenswrapper[4906]: E0310 00:41:06.983994 4906 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"599f46ecef340d27644df1fece3d2421dbaf06dc4b3226d551e3f641b4f8a212\": container with ID starting with 599f46ecef340d27644df1fece3d2421dbaf06dc4b3226d551e3f641b4f8a212 not found: ID does not exist" 
containerID="599f46ecef340d27644df1fece3d2421dbaf06dc4b3226d551e3f641b4f8a212" Mar 10 00:41:06 crc kubenswrapper[4906]: I0310 00:41:06.984023 4906 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"599f46ecef340d27644df1fece3d2421dbaf06dc4b3226d551e3f641b4f8a212"} err="failed to get container status \"599f46ecef340d27644df1fece3d2421dbaf06dc4b3226d551e3f641b4f8a212\": rpc error: code = NotFound desc = could not find container \"599f46ecef340d27644df1fece3d2421dbaf06dc4b3226d551e3f641b4f8a212\": container with ID starting with 599f46ecef340d27644df1fece3d2421dbaf06dc4b3226d551e3f641b4f8a212 not found: ID does not exist" Mar 10 00:41:07 crc kubenswrapper[4906]: I0310 00:41:07.422707 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e03d7aa-9094-4f89-85af-4ebbbf50e388-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3e03d7aa-9094-4f89-85af-4ebbbf50e388" (UID: "3e03d7aa-9094-4f89-85af-4ebbbf50e388"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 00:41:07 crc kubenswrapper[4906]: I0310 00:41:07.428495 4906 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e03d7aa-9094-4f89-85af-4ebbbf50e388-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 00:41:07 crc kubenswrapper[4906]: I0310 00:41:07.563762 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sxr8g"]
Mar 10 00:41:07 crc kubenswrapper[4906]: I0310 00:41:07.585353 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-sxr8g"]
Mar 10 00:41:08 crc kubenswrapper[4906]: I0310 00:41:08.595172 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e03d7aa-9094-4f89-85af-4ebbbf50e388" path="/var/lib/kubelet/pods/3e03d7aa-9094-4f89-85af-4ebbbf50e388/volumes"
Mar 10 00:41:21 crc kubenswrapper[4906]: I0310 00:41:21.617744 4906 patch_prober.go:28] interesting pod/router-default-5444994796-kpmwl container/router namespace/openshift-ingress: Liveness probe status=failure output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 10 00:41:21 crc kubenswrapper[4906]: I0310 00:41:21.618254 4906 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-ingress/router-default-5444994796-kpmwl" podUID="6bbf8e83-d583-49de-a12c-6f0a3953dc67" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 10 00:42:00 crc kubenswrapper[4906]: I0310 00:42:00.135888 4906 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551722-9cdkw"]
Mar 10 00:42:00 crc kubenswrapper[4906]: E0310 00:42:00.136840 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e03d7aa-9094-4f89-85af-4ebbbf50e388" containerName="registry-server"
Mar 10 00:42:00 crc kubenswrapper[4906]: I0310 00:42:00.136854 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e03d7aa-9094-4f89-85af-4ebbbf50e388" containerName="registry-server"
Mar 10 00:42:00 crc kubenswrapper[4906]: E0310 00:42:00.136875 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e03d7aa-9094-4f89-85af-4ebbbf50e388" containerName="extract-content"
Mar 10 00:42:00 crc kubenswrapper[4906]: I0310 00:42:00.136881 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e03d7aa-9094-4f89-85af-4ebbbf50e388" containerName="extract-content"
Mar 10 00:42:00 crc kubenswrapper[4906]: E0310 00:42:00.136892 4906 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e03d7aa-9094-4f89-85af-4ebbbf50e388" containerName="extract-utilities"
Mar 10 00:42:00 crc kubenswrapper[4906]: I0310 00:42:00.136899 4906 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e03d7aa-9094-4f89-85af-4ebbbf50e388" containerName="extract-utilities"
Mar 10 00:42:00 crc kubenswrapper[4906]: I0310 00:42:00.137010 4906 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e03d7aa-9094-4f89-85af-4ebbbf50e388" containerName="registry-server"
Mar 10 00:42:00 crc kubenswrapper[4906]: I0310 00:42:00.137440 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551722-9cdkw"
Mar 10 00:42:00 crc kubenswrapper[4906]: I0310 00:42:00.140832 4906 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-ktdr6"
Mar 10 00:42:00 crc kubenswrapper[4906]: I0310 00:42:00.142105 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 00:42:00 crc kubenswrapper[4906]: I0310 00:42:00.142253 4906 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 00:42:00 crc kubenswrapper[4906]: I0310 00:42:00.144345 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551722-9cdkw"]
Mar 10 00:42:00 crc kubenswrapper[4906]: I0310 00:42:00.157572 4906 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5fs8\" (UniqueName: \"kubernetes.io/projected/91133f1a-a431-48ba-b8ee-ef8d3f3e2f35-kube-api-access-s5fs8\") pod \"auto-csr-approver-29551722-9cdkw\" (UID: \"91133f1a-a431-48ba-b8ee-ef8d3f3e2f35\") " pod="openshift-infra/auto-csr-approver-29551722-9cdkw"
Mar 10 00:42:00 crc kubenswrapper[4906]: I0310 00:42:00.258672 4906 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5fs8\" (UniqueName: \"kubernetes.io/projected/91133f1a-a431-48ba-b8ee-ef8d3f3e2f35-kube-api-access-s5fs8\") pod \"auto-csr-approver-29551722-9cdkw\" (UID: \"91133f1a-a431-48ba-b8ee-ef8d3f3e2f35\") " pod="openshift-infra/auto-csr-approver-29551722-9cdkw"
Mar 10 00:42:00 crc kubenswrapper[4906]: I0310 00:42:00.286912 4906 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5fs8\" (UniqueName: \"kubernetes.io/projected/91133f1a-a431-48ba-b8ee-ef8d3f3e2f35-kube-api-access-s5fs8\") pod \"auto-csr-approver-29551722-9cdkw\" (UID: \"91133f1a-a431-48ba-b8ee-ef8d3f3e2f35\") " pod="openshift-infra/auto-csr-approver-29551722-9cdkw"
Mar 10 00:42:00 crc kubenswrapper[4906]: I0310 00:42:00.454531 4906 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551722-9cdkw"
Mar 10 00:42:00 crc kubenswrapper[4906]: I0310 00:42:00.946025 4906 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551722-9cdkw"]
Mar 10 00:42:01 crc kubenswrapper[4906]: I0310 00:42:01.453656 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551722-9cdkw" event={"ID":"91133f1a-a431-48ba-b8ee-ef8d3f3e2f35","Type":"ContainerStarted","Data":"cd42b615d08717f52d5a064c1f6ccd7a6c0d64efd79c36d8f8d34c1fe00ce214"}
Mar 10 00:42:02 crc kubenswrapper[4906]: I0310 00:42:02.462239 4906 generic.go:334] "Generic (PLEG): container finished" podID="91133f1a-a431-48ba-b8ee-ef8d3f3e2f35" containerID="56601e0f6c03916efae075fdccc9dcb371cce024f898f5d98fa3744add8aa1a1" exitCode=0
Mar 10 00:42:02 crc kubenswrapper[4906]: I0310 00:42:02.462301 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551722-9cdkw" event={"ID":"91133f1a-a431-48ba-b8ee-ef8d3f3e2f35","Type":"ContainerDied","Data":"56601e0f6c03916efae075fdccc9dcb371cce024f898f5d98fa3744add8aa1a1"}
Mar 10 00:42:03 crc kubenswrapper[4906]: I0310 00:42:03.732498 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551722-9cdkw"
Mar 10 00:42:03 crc kubenswrapper[4906]: I0310 00:42:03.907895 4906 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5fs8\" (UniqueName: \"kubernetes.io/projected/91133f1a-a431-48ba-b8ee-ef8d3f3e2f35-kube-api-access-s5fs8\") pod \"91133f1a-a431-48ba-b8ee-ef8d3f3e2f35\" (UID: \"91133f1a-a431-48ba-b8ee-ef8d3f3e2f35\") "
Mar 10 00:42:03 crc kubenswrapper[4906]: I0310 00:42:03.912739 4906 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91133f1a-a431-48ba-b8ee-ef8d3f3e2f35-kube-api-access-s5fs8" (OuterVolumeSpecName: "kube-api-access-s5fs8") pod "91133f1a-a431-48ba-b8ee-ef8d3f3e2f35" (UID: "91133f1a-a431-48ba-b8ee-ef8d3f3e2f35"). InnerVolumeSpecName "kube-api-access-s5fs8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 00:42:04 crc kubenswrapper[4906]: I0310 00:42:04.010320 4906 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5fs8\" (UniqueName: \"kubernetes.io/projected/91133f1a-a431-48ba-b8ee-ef8d3f3e2f35-kube-api-access-s5fs8\") on node \"crc\" DevicePath \"\""
Mar 10 00:42:04 crc kubenswrapper[4906]: I0310 00:42:04.479193 4906 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551722-9cdkw" event={"ID":"91133f1a-a431-48ba-b8ee-ef8d3f3e2f35","Type":"ContainerDied","Data":"cd42b615d08717f52d5a064c1f6ccd7a6c0d64efd79c36d8f8d34c1fe00ce214"}
Mar 10 00:42:04 crc kubenswrapper[4906]: I0310 00:42:04.479239 4906 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd42b615d08717f52d5a064c1f6ccd7a6c0d64efd79c36d8f8d34c1fe00ce214"
Mar 10 00:42:04 crc kubenswrapper[4906]: I0310 00:42:04.479282 4906 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551722-9cdkw"
Mar 10 00:42:04 crc kubenswrapper[4906]: I0310 00:42:04.789808 4906 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551716-kmxb7"]
Mar 10 00:42:04 crc kubenswrapper[4906]: I0310 00:42:04.794707 4906 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551716-kmxb7"]
Mar 10 00:42:06 crc kubenswrapper[4906]: I0310 00:42:06.595801 4906 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="175ba419-d912-49ca-8706-4e6b5ca2eeca" path="/var/lib/kubelet/pods/175ba419-d912-49ca-8706-4e6b5ca2eeca/volumes"